| Input (string, length 251–41.6k) | Output (string, length 137–9.7k) | input_ids (sequence, length 157–2.05k) | attention_mask (sequence, length 157–2.05k) | labels (sequence, length 157–2.05k) |
|---|---|---|---|---|
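The sketch below shows one plausible way a row with this schema could be constructed from a (review, summary) pair. It is illustrative only: the dump does not identify the tokenizer, the exact prompt template beyond what appears in the Input column, or the labeling scheme, so the checkpoint name and the choice to copy input_ids into labels are assumptions inferred from the rows shown further down (attention masks of all ones, label IDs matching the input IDs).

```python
from transformers import AutoTokenizer

# Assumption: the dump does not say which tokenizer produced the ID columns.
# GPT-NeoX is used as a stand-in because IDs such as 50276 exceed GPT-2's
# vocabulary size; any sufficiently large tokenizer would work for this sketch.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")


def build_row(review_text: str, summary_text: str) -> dict:
    """Build one dataset row from a (review, summary) pair."""
    prompt = (
        "Below is given a review of a research paper from a conference journal. "
        "Please write a summary of the review.\n### Review:\n"
        + review_text
        + "\n### Summary: "
    )
    # Tokenize the prompt concatenated with the target summary.
    enc = tokenizer(prompt + summary_text)
    return {
        "Input": prompt,
        "Output": summary_text,
        "input_ids": enc["input_ids"],
        # A single unpadded example yields an all-ones mask, matching the
        # attention_mask column in the rows below.
        "attention_mask": enc["attention_mask"],
        # For causal-LM fine-tuning the labels mirror input_ids, which is the
        # pattern visible in the labels column below.
        "labels": list(enc["input_ids"]),
    }
```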
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents sample efficient algorithms for learning in twoplayer zerosum markov games the algorithms are for decoupled and coordinated settings the latter of which is based on alternate optimism the authors also extend the eluder dimension of mdps to to zerosum markov games using the minimax bellman operator the authors develop algorithms for optimistic nash eliminated for zerosum twoplayer markov games based on valuefunction elimination zhang 2017 the contributions are clear in the introduction and the technical challenges section provide good explanation of which concentration bounds and ideas of optimism are employed in the paper it would be useful to have some clear definition of what decoupled and coordinated means in this case how do those results relate to similar settings in single agent learning the discussion feels lacking i would expect some more commentary comparing the various algorithms currently they are treated mostly as separate algorithms is there a unifying framework of the three algorithms that you could expand on or comment on how the regret bounds compare with each other when to use one over the other i think addressing this could make the story more cohesive the contributions seem significant for the zerosum markov games settings the motivation for constructing the algorithms in section 32 411 and 421 is a bit lacking what are you trying to do and why are you doing this way perhaps some transition sentences could help with clarity a minor but related comment adding line number references in those sections can help make it more clear it would be nice to see commentary on the significance of the informal theorems since the proof is omitted perhaps comment on whether it is a surprising or expected result the authors present a wide range of results on efficient algorithms for zerosum markov games these results leverage several previous works on valuefunction elimination using the bellman factorization and bellman eluder dimension the contributions seem to be significant the correctness seems good but it is possible i did not understand some parts of the submission and unfamiliar with some piece of relate work the technical novelty seem high providing new theoretical insights into minimax markov games docsepthe paper discussed markov games with general function approximation they consider two settings the decoupled one where the player does not observe the opponents policy and the coordinated one where optimistic planning is performed for both the former works under a new notion of eluder dimension that is appropriate for games and the algorithm itself is a variant of jin et al 2021 the second works under a notion similar to the witnessed rank of sun et al 2019a and the algorithm is also inspired by theirs i have few remarks the work alludes to general function approximation i advise caution with the wording as we do not have yet a full characterization of general function approximation in rl besides the notion of eluder dimension does not seem to give rise to models much more general than generalized linear models to the best of my understanding the work seems to be a collage of different ideas namely the bellman eluder dimension and witnessed rank the authors should explain in much better depth why they use one instead of the others in the two different settings most of the ideas in this work have been introduced previously in non games the lack of computational tractability should be emphasized much better while i listed several critical comments to this work 
on the whole i liked the work because it still makes a nontrivial contribution to the theory of markov games docsepthis paper studies efficient function approxiamtion in twoplayer zerosum markov games with general function classes both decoupled and coordinated settings for learning agents are considered modelfree algorithms for both settings and modelbased algorithm for the leter are provided all with proved sample complexities strengths the paper is well organized and properly presented both intuitive design and technical proofs are easy to follow weaknesses 1 my major concern is about the difference between the modelfree algorithms and the work by jin et al 2021b it seems that they share almost the same idea of the algorithm design similar analysis and results for sampling complexity especially extending the same measure of eluder dimension named minimax eluder dimension in this paper while multiagent eluder dimension in jin et al 2021b without any comparison can the auther highlight any technical contribution beyond jin et al 2021b 2 the assumptions of realizability and completeness are not wellmotivated it would be better to provide more convicing evidence that these assumptions are realisitc 3 one minor question is that whether the sample complexity of the modelbased algorithm can be compared with the modelfree one i think this paper is blow the bar of acceptance docsepthis paper considers zerosum markov games in a general regime where the model is parameterized by general function classes the goal of this research is to investigate reinforcement learning algorithms that learn a nash policy in a trialanderror fashion the level of generality is achieved by introducing two complexity measures the minimax eluder dimension and the witness rank the main result three algorithms with provable regret upper bounds involving covering numbers and the minimax eluder dimension strengths the presentation is mostly clear and to the point the main results are related to the relevant literature a particular strong feature of this paper is a detailed appendix containing the main proofs of the paper the appendix is mostly well written and one obtains a good overview weaknesses i see a couple of issues with the references some entries seem to be duplicated or not cited properly this seems to be the case here julien perolat bruno scherrer bilal piot and olivier pietquin approximate dynamic programming for twoplayer zerosum markov games in international conference on machine learning pp 13211329 2015a julien perolat bruno scherrer bilal piot and olivier pietquin approximate dynamic programming for twoplayer zerosum markov games international conference on machine learning 3713211329 2015b ian osband and benjamin van roy modelbased reinforcement learning and the eluder dimension arxiv preprint arxiv14061853 2014 this is a nips2014 paper and at many more places note that reference yulai zhao yuandong tian jason d lee and simon s du provably efficient policy gradient methods for twoplayer zerosum markov games 2021 is not even properly referenced i do not understand the bellman operator defined in eq 1 usually the value is computed outside of the expectation and not inside i would also not know what value iteration applied to this operator would deliver note that the bellman operator your introduce is not the one defined in the paper you are citing at this stage julien perolat bruno scherrer bilal piot and olivier pietquin approximate dynamic programming for twoplayer zerosum markov games international 
conference on machine learning 3713211329 2015b in that reference you clearly see that the operator approach of stochastic games is follows which is the mathematically correct way to define equilibrium in markov games why is vpi or pinu unique the definition of the measures pifh is confusing first i belief there is a typo on the sentence preceding sentence it should now be fin fh but rather fin f doesnt it only then i would understand that you want to construct a sequence of measurevalued policies indexed by the approximating function sequence f and the time horizon the proposed minimax eluder dimension looks like a special case of the bellman eluder dimension on previously introduced in the context of mdps what is the rational of looking at dirac measures in this definition also if the state space is not compact i am wondering if this notion is a suitable concept what is the minimax eluder dimension on the real line algorithm 1 is a cuttingplane method in the sequence space of reward functions if i understood it correctly the round k profile of rewards is chosen as the best reward function at time 1 of the markov game when all perperiod policies are myopically minmaxed i am wondering how this is computed to me the argmax defining fk reads like an optimization problem over a space of functions and i do not see how this can be done efficiently in the present setting note that you do not assume the state space to the finite unless i missed something this looks to me like an extremely challenging task and some more explanation would be needed to understand the hardness of this step after eq 8 the set mathcalzrho has not been defined before i think the paper comes with a very detailed appendix containing all the necessary proofs this is a big plus of the paper i am not quite sure though how novel the presented results are at the same time i am admitting that i am not an expert on mdps or rl from an algorithmic perspective i have some doubts about the computational complexity of the algorithms presented here but it seems this is a general problem in this literature so again i would not put much weight on my critique here what i find quite confusing is the use of classical concepts like the shapley operator the analogue of the bellman operator in stochastic games is defined in a way that i have not seen before in the operator approach to stochastic games it is used to define an extension of dynamic programming to competitive settings a key property is that the shapley operator is a contraction and this drives a value iteration kind of algorithm i am not sure how these classical results translate in the present formulation as the operator is different and it is not clear whether contraction properties can be used the question is then to me how one can relate regret and equilibrium note that we know that regret minimization and nash equilibrium are not the same and even in zero sum games usually regret minimization implies something about the ergodic average but not the last iterate this confuses me quite a bit in this paper and the cited references to be fair
### Summary: | summary the paper discusses markov games with general function approximation and investigates in particular reinforcement learning algorithms that learn a nash policy in a trialanderror fashion they consider two settings the decoupled one where the player does not observe the opponents policy and the coordinated one where optimistic planning is performed for both the main contribution is a new complexity measure called minimax eluder dimension which is used to control the regret of the proposed algorithms discussions the reviewers raised many minor concerns regarding the writing typos of missing notation and the clarity missing discussions and explanations which were addressed during the discussion phase in light of this revision the committee and myself judge that the paper should be accepted to iclr decision accept | [
360, 1616, 729, 3958, 970, 253, 7221, 991, 17487, 1342, 5572, … ] (input_ids: token IDs of the Input text; the full sequence is omitted here for readability) | [
1, 1, 1, … ] (attention_mask: every value in this row is 1; the full sequence is omitted) | [
360, 1616, 729, 3958, 970, 253, 7221, 991, 17487, 1342, 5572, … ] (labels: in this row the label IDs repeat the input_ids; the full sequence is omitted) |
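As a usage note, rows like the one above can be loaded and inspected as sketched below. Both the dataset repository name and the tokenizer checkpoint are placeholders, since neither is stated in this dump; the decode call only demonstrates that the ID columns are tokenized text, not which tokenizer actually produced them.

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Placeholders: neither the dataset repository nor the tokenizer is named above.
ds = load_dataset("your-org/review-summarization", split="train")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

row = ds[0]
# The three sequence columns are aligned token-for-token within a row.
assert len(row["input_ids"]) == len(row["attention_mask"]) == len(row["labels"])

# Decoding the IDs should recover the concatenated review prompt and summary.
print(tokenizer.decode(row["input_ids"][:64]))
```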
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors proposed latent optimistic value exploration love as a mechanism to leverage optimistic exploration for continuous visual control the main idea is to use a small 5 ensemble of latent models with shared encoders and therefore shared learned latent space but different transition reward and value models the variance of predictions from this ensemble can be used as uncertainty estimates of each action sequence while the mean is the typical policy learning objective then love puts more weighs on the states with high variance during exploration to enforce the agent to visit optimistically uncertain states this is inherently optimistic because there is a positive bias beta 0 and var 0 addition to the expected reward the idea of the paper is not novel and it has been explored before such as t kurutach et al iclr 2018 and unfortunately the authors did not provide enough context in the related works to distinguish their work with previous research however in its proposed form love is new to the best of my knowledge using the variance of predictions of an ensemble of agents is an interesting way of approximating uncertainty and the experiments demonstrate how this bias can improve the results however i find the experiments not to be convincing enough that the added complexity is necessary to achieve the improved performance first the main claim of the paper is that the proposed method achieves better score by exploring better however there are many changes that can cause this improvement for example using an ensemble of the models with different transition reward and value models is essentially using a bigger ie with more parameters model there is an ablation study that demonstrates only using an ensemble doesnt work lve alternate however this is not convincing that the improvements are coming from better exploration this can be done in multiple ways such as demonstrating that a base method eg dreamer works better if trained on the data collected by love an ablation study with a negative beta can be also super helpful although i believe the results of that is kinda trivial second there is no study that visualizes the importance of the ensemble the authors used 5 particles throughout the experiments but why 5 im not suggesting optimizing this number as a hyperparameter or anything like that but what im looking for is a study that clearly demonstrates that having a more accurate approximation of uncertainty is important this can be achieved by studying the effect of number of particles on the performance of the agent visualizing the actual variance of the predictions and demonstrating that the predictions actually vary is also important this tightly related to various values for beta which also requires another ablation study there are also many other way of enforcing optimism that the proposed method can be compared to overall the authors are on the right track the paper except for related works is very well written and easy to read however more experiments are required to clearly demonstrate why the method is working and how docsepsummary this paper proposes love latent optimistic value exploration a modelbased exploration algorithm for pomdp or pixelbased control systems the method builds upon dreamer hafner et al 2019 for learning latent models and the variance of value estimates one transitionrewardvalue model per particle estimates the epistemic uncertainty of the world which can be used as a ucbstyle exploration reward love can achieve a comparable or better performance on a simple 
pointmass environment with a trap and standard deepmind continuous control suites strength this paper studies a highsignificance problem of learning representation for visualbased control and deep exploration selfsupervised learning of latent representation and modelbased exploration is an important and timely research problem to study overall the paper is written well with a clear motivation and a good organization of the method presentation in terms of pseudocode plot table and hyperparameters look great the method is straightforward and the idea behind the algorithm is very reasonable weakness the paper has limited novelty the core idea of modelbased planning and disagreementbased exploration are taken from existing works though combining them and making them work in a challenging pixelbased continuous control tasks would be a nontrivial work crucially there are a few of very similar works in modelbased exploration which also builds upon dreamer please see the detailed comments below there might be some technical difference but a comprehensive comparison with existing works would be critical it is not clearly discussed or mentioned in the paper why the proposed method can be called as a deep exploration or how beneficial it would be compared to shallow explorations detailed comments there are some similar works in the modelbased exploration literature which this paper did not cite or compare with plan2explore sekar et al icml 2020 this paper learns a dreamerlike model where an ensemble of qh s a is learned and the variance of prediction of the latent variable h is used as the uncertainty or the disagreement intrinsic reward the difference to love would be the uncertainty is measured by estimation of value function or latent representations it will be interesting to see how these approaches can be compared value estimation might be much harder than the transition model especially when the task reward is sparse and would not be directly applicable in a rewardfreeunsupervised setup in such scenarios how can love learn to explore the world more related works to consider ready policy one rp1 ball et al arxiv 2020 modelbased active exploration max shyam et al icml 2019 learning awareness models amos et al iclr 2018 in terms of experimentsempirical evaluation there is no baseline exploration algorithm presented that is based on dreamer or pixelbased control which makes the effectiveness of the proposed approach a bit difficult to be assessed for example what if we do a straightforward exploration such as curiosity pathak et al 2017 rnd bruda et al 2018 on dreamer additional comments it would be great to provide more clarificationexplanation on how lve love with beta0 and dreamer are different additional analysis and experiments would also have strengthened the paper such as how does the algorithm performance differ under different values of the planning horizon how does the performance of the algorithm vary under different number m of ensembles as in hafner et al 2019a it would be great to clarify the difference between pcdot and qcdot in their meaning ie p for distributions that samples from real environments and q for approximations docsepthe proposed method seems like a simple and effective combination of existing ideas and approaches overall while i like the approach but am leaning towards a reject since i think there are aspects in the experiment section that can be clarified and improved however i am open to changing my decision if my concerns are addressed during the rebuttal period 
strengths the ideas presented are simple extensions of existing methods and are quite easily digested the proposed approach does seem to learn faster on some of the tasks considered although some aspects of this need clarification the experimental details presented in the appendix are quite thorough and did help me understand some aspects of the work in more detail issues and points worth clarifying apart from a number of points of clarification that are listed later my main concern is with the experimental section of the paper this work compares performance of the proposed method on 8 of the deepmind control suite domains and one toy domain with very little ablation in contrast the original drq and dreamer work which are used as baseline here show results on 15 and 20 control suite tasks on top of some results on atari the number of domains becomes somewhat more important in light of the fact that on 2 of the 8 domains considered drq outperforms the proposed approach the explanation regarding these 2 tasks having dense rewards does make sense but more data would help substantiate the claim another concern i have is with regard to the presentation of the results in figure 3 most of the curves are cut off before they have converged this makes it hard to sanity check against the existing results from the dreamer and drq paper and it remains unclear to me if love performs asymptotically as well as the baselines on some of the tasks perhaps this information could be included instead in table 1 which as far as i can tell does not currently present any new information that is not present in figure 3 finally the paper could also include more ablations especially regarding the effect of the ensemble size and how important the step increase schedule is for the beta parameter of the ucb objective finally there are some points which i dont fully understand based on my reading of the text and for which clarification would greatly help the main text mentions that love has exploration noise turned off since exploration happens through the ucb noise was this the same setting used for the lve where beta is set to 0 my understanding is that lve does have exploration noise but it would be good for the authors to confirm this the parameter in the ucb objective beta is progressively increased from 0 in delta steps of 0001 where does this number come from was this a hyperparameter learning multiple dreamer models is an interesting idea but the obvious issue with this is that it could bloat wall clock learning time how much slower is the proposed approach to vanilla dreamer in terms of wall clock time while understandably the focus of this work is data efficiency i think this is an important number to mention perhaps even in the appendix to paint a more complete picture update nov 25 i am happy that the authors improved the paper with reviewer feedback in particular i think the new ablations and comparison and mention of previous work makes the work more complete the results on new sparse tasks are also interesting i still think more can be done in terms of experimental validation in particular my original note regarding early cutting of the curves has not been addressed however overall i think the paper does meet the acceptance threshold as things stand docsep summary the paper proposes love an adaptation of dove seyde20 to latent variable predictive models seyde20 only condsidered predictive models without latent variables seyde20 proposes to use a generalization of upper confidence bound to deep modelbased 
rl by training an ensemble of models and value functions and training a policy to maximize mean variance of this ensemble similarly to lowrey19 the submission empirically demonstrates that tuning the learning rate and number of training steps per environment steps of dreamer hafner20 improves sample efficiency using an ensemble of predictive models further improves data efficiency slightly on cartpole and walker tasks while on top of that the proposed exploration method slightly improves sample efficiency on the hopper and sparse cartpole tasks decision the submission contains little technical novelty over prior work of seyde 2020 the experimental results are weak but somewhat justify the claims as there is a slight but consistent improvement on some tasks however the paper suffers from a major flaw in the empirical evaluation figure 3 and the relevant discussion describe love as significantly outperforming the dreamer baseline this difference is largely due to the fact love uses a different learning rate and number of epochs which improves sample efficiency the paper graciously provides the comparison to the fairly tuned baseline in the appendix as figure 9 confirming this the fairly tuned baseline needs to be moved from the appendix to the main paper and the contribution section and the discussion of the experiments need to be rewritten accordingly if this is provided i will reevaluate the paper in the current state of the paper i am unable to consider its merits on the basis of this flaw strengths the paper is technically correct except for the flaw explained above and proposes a promising approach to a relevant problem of exploration in rl from images the experimental results indicate that the proposed method could be effective weaknesses the major flaw of the paper is described earlier in addition there are two other major issues with experimental evaluation the experimental evaluation of the paper is rather weak the proposed exploration method only improves performance in 2 out of 8 environments this might be because the other environments do not require sophisticated exploration in which case the method needs to be tested on more than 2 relevant environments sparsereward versions of the evaluated environments can be easily designed and would be suitable for evaluating the method the second major issue is that the method is not evaluated against any competing exploration baselines even though the paper cites multiple prior works on this for instance the paper claims that methods based on information gain or improving knowledge of the dynamics will not explore as efficiently as the proposed method both of these claims need to be empirically evaluated or toned down additional comments the related work section is missing the following papers ball20 is a modelbased rl method that uses ucb for exploration sekar20 is a modelbased rl method that uses latent ensemble variance for taskagnostic exploration ball20 ready policy one world building through active learning sekar20 planning to explore via selfsupervised world models update the new sparse tasks and comparison to dreamer curious improve the paper and address some of my concerns specifically a sizable improvement due to exploration is now seen on 3 tasks hopper cartpole sparse and cheetah sparse the new maze task is also more challenging than the bug trap task final decision after the significant improvements in the experimental evaluation i believe the paper provides a reasonable case for the proposed latent ucb method it also 
provides an interesting discussion on the advantages of ucbstyle methods and an interesting observation that optimistic rewardbased exploration can be effectively used even in absence of positive rewards even though the experimental evaluation of prior work on exploration is still rather lacking i believe that these contributions are enough for the paper to be interesting to the iclr audience i raise my score to 6 remaining weaknesses the experimental evaluation in the paper is still quite lacking in terms of baselines making it impossible to judge whether the paper actually works better than prior work first the proposed method contains two improvements model ensembling and optimistic exploration but doesnt go much indepth analyzing either of these improvements instead trying to focus on both at the same time this makes comparison to prior exploration methods hard because the proposed method receives an additional boost due to an ensemble of dynamics the paper conveniently quantifies this boost in the lve method and it is shown to be rather large for a more fair comparison the ensemble of dynamics might be ablated leaving only the ensemble of value functions or the competing baselines could also be built on top of lve second the paper only compares against one competing exploration method dreamer curious there has been a large amount of proposed exploration methods and it would be appropriate to evaluate the proposed ucb method against at least a few of them for instance the paper could compare against similar value function ensemble techniques osband16 lowrey18 seyde20 or other cited work ostrovski17 pathak17 burda18 is not cited but perhaps should be compared against all these methods can be relatively easily implemented on top of lve for a fair comparison burda18 exploration by random network distillation additional comments would be great to clarify what is the observation space for the bug trap and maze tasks for instance you could add observations and predictions for these tasks to the appendix
### Summary: | the submission is acknowledged as having potential value in terms of proposing a new approach for exploration based on ensembles and value functions however there are lingering concerns about the discussion of what this paper brings to the table visavis prior work together with a lack of clear demonstration of the explicit gains from the exploration mechanism and more experimental studies the authors would do well to revise as per the feedback given and resubmit a version with a more compelling argument | [input_ids omitted: 2048 token ids]
[attention_mask omitted: 2048 ones]
[labels omitted: 2048 token ids]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper aims to improve image translation for the task of plague segmentation by building an approach which can preserve small subtle structures as well as global and intermediate structures the approach makes use of unit liu et al 2017 an image translation network with a single shared latent space and a selfexpressive loss ji et al 2017 which helps with separating the subspaces and crossdomain translation pros the authors explain the task and motivations well and validate against another well known translation network unit the results are promising with improved performance compared to unit a small decrease in performance is shown in comparison to training on real images which is to be expected cons the method can be more thoroughly validated and a more detailed illustration of the network can be given docsepthis paper proposes to add a selfexpressiveness regularization term to learn a union of subspaces for imagetoimage translation in medical domain its shown that such selfexpressiveness constraint can help to preserve subtle structures during image translation which is critical for medical tasks such as plaque detection the motivation and methodology are well explained with proper reference works improvement on plaque detection is signification comment it would nice if the authors could also show some visualisations of the latent space with comparisons between with and without the constraint this will provide more insights or explanationsdocsepsummary this short paper discusses a method for crossdomain plaque detection using image translation methods translating from precontrast to postconstrast ct the method adds an additional constraint to the learning objective to force the translation model to learn a representation constructed of easily separable subspaces the authors suggest that this allows the model to better represent the different structures in the images their experiments show a decreased drop in performance over competing methods strengths this is an interesting application of selfexpressiveness loss and the reported results show that the proposed method might achieve a reasonable crossdomain performance the experiments are very limited but seem promising weaknesses the paper is not very generous with information about the implementation of the method we are told nothing about the encoding and decoding networks or the segmentation model for example the comparison with competing methods is very limited the paper seems to depends heavily on work by liu et al it is of course a short paper questions the main assumption of the authors is that different anatomical structures would be represented by different subspaces in the representation it would be interesting to know if this is really what happens in the model or whether the proposed method improves in other waysdocsepthe authors propose to extend the unit model for imagetoimage translation and apply this to synthesis of noncontrast ct images from contrastenhanced ct images subsequent experiments show that aortic calcifications can be automatically identified in the synthetic noncontrast images but not in the original contrast images strengths its very strong that the authors not only perform imagetoimage translation but also evaluate the effect of this translation on a subsequent segmentation task good experiments comparing to a situation without image translation and one with single subspace image translation quantitative results are convincing results suggest that the proposed approach would allow automatic aortic 
calcium scoring in contrastenhanced images without the need for annotated training data in these images weaknesses quite information dense paper its not entirely clear what was exactly the contribution of the authors is eg subspace clustering seems to have been proposed previously as has the unit model the individual models could have been explained a bit better a diagram would have been useful its unclear what the contribution of the small patches is it would be interesting to visualize the subspaces using these small patches moreover its unclear how the number of patches n is determined some typos for simplicty one or more subspace
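For reference, the self-expressiveness term mentioned in these reviews (Ji et al., 2017) amounts to asking each latent code to be reconstructed as a combination of the other codes in the batch, which pushes the codes toward a union of low-dimensional subspaces. The sketch below is a minimal PyTorch version; the regularisation weight and the way the term would be combined with the translation losses are assumptions rather than the paper's implementation.

```python
import torch

def self_expressive_loss(z, coef, lam=1e-2):
    """Self-expressiveness penalty on a batch of latent codes.

    z    : (n, d) latent codes for n samples.
    coef : (n, n) learnable coefficient matrix; its diagonal is zeroed so that
           a sample cannot trivially reconstruct itself.
    Each code is encouraged to be a combination of the other codes, which
    pushes the latent space toward a union of separable subspaces.
    """
    c = coef - torch.diag(torch.diag(coef))      # zero the diagonal
    recon = torch.mean((z - c @ z) ** 2)         # z_i approx sum_j c_ij z_j
    reg = lam * torch.mean(c ** 2)               # keep the coefficients small
    return recon + reg

# Toy usage: the penalty would simply be added, with some weight, to whatever
# reconstruction / translation objective the model already optimises.
z = torch.randn(8, 16)
coef = torch.zeros(8, 8, requires_grad=True)
loss = self_expressive_loss(z, coef)
loss.backward()
```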
### Summary: | i agree with the reviewers that the paper is quite densely written it is difficult to understand the details and what exactly the contribution of this work is but i like that the paper presents image synthesis which is evaluated with a clinically relevant application this is an interesting approach for detection of arterial calcifications | [input_ids omitted: 818 token ids]
[attention_mask omitted: 818 ones]
[labels omitted: 818 token ids]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper presents a clean subspace variational autoencoder clsvae model for systematic outlier detection building on vae for latent variable modelling the paper proposes a semisupervised learning to infer the potential outlier the author made the following assumptions 1 outliers exist therein the inliers are still the majority 2 outliers are systematic with predictable recurring patterns 3 compression hypothesis inlier data can be compressed further than outlier data 4 outliers are a combination of clean and dirty patterns on the model side a mixture model is proposed parametrized by bernoulli mixing weight and decoder pxz while the encoder qzx is applied for sampling latent variable z a standard elbo is formulated to maximise the log likelihood for both y known and unknown scenarios for outlier repair the author proposed to minimize the correlation between noise and clean signal experimentally the author evaluate two tasks outlier detection and automated repair on freyfaces3 fashionmnist xiao et al 2017 syntheticshapes strength the problem is well motivated and clearly presented very easy to follow the formulation is very clear the performance of outlier removal is satisfactory on simple datasets weakness the idea and formulation is a little incremental and similar to rvae eduardo etal the main difference could be the semisupervised mixture model some thinking how correlated is the denoising quality and the downstream task eg classification could it be formulated together how is the model related to blindsource separation and ica could the author gave some insight the author presented clsvae to handle systematic outlier the problem is well motivated and clearly presented very easy to follow the formulation is very clear the performance of outlier removal is satisfactory on simple datasets the idea and formulation is a little incremental and similar to rvae eduardo etal with the main difference could be the semisupervised mixture model as mentioned above i wish the author could answer the two following questions 1 how correlated is the denoising quality and the downstream task eg classification could it be formulated together 2 how is the model related to blindsource separation and ica could the author gave some insight docsepthis paper presents a neural network model based on variational autoencoders vae that learn an implicit representation separating outliers with systematic errors from inliers using a small labelled subset of the training data set trusted set more specifically clean data and recurring systematic errors are represented in separate latent subspaces where outliers are supposed to be a linear combination of such clean and dirty patterns after training the model there are two tasks outlier detection and automated repair removing the dirty patterns from the detected outliers overall the paper is well written and the contributions are clearly formulated however i have several concerns with it the intuition behind the assumption that inliers can be represented in a subspace of the clean and dirty patterns is not so obvious especially given the datasets that have been used in the experiments these systematic errors of adding or masking out lines or squares at fixed positions actually simplify the data and the variance is reduced the problem is clearly stated but to me it is not evident what would be a practical application of this algorithm and a real use case first of all the fact that a labelled trusted data set is needed albeit small constrains the practical usage second 
the use case of detecting images where some pixels are set to default values eg from a deficient camera would be dependent on a specific sensor finally the application of watermarks is mentioned and makes more sense however the experimental evaluation does not include this more complex type of outliers there is an extensive appendix which would be ok but the main paper misses some explanations and details that are only mentioned in the apendices for example appendix e and f some minor remarks section 42 on the variational model mixes the theoretical model with some implementation aspects like stop gradient which is only related to trainingoptimisation also some parts are not explained like the distribution pi the figures and tables are too small the paper presents and interested approach that is well formulated however the main motivations and assumptions are not well comprehensible theoretical and practical implementation aspects are mixed and the presentation is lacking clarity in some places docsepthis paper develops a deep learning based method for detecting and repairing systematic outliers in data before performing model learning on the data the method is called clean subspace variational autoencoder clsvae clsvae takes as input a trusted set provided by the user the trusted set contains a set of inliers and outliers but not requiring information about how the outliers are corrupted and seeks to learn a variational autoencoder where the latent code is a concatenation of two subspaces a clean subspace and a dirty subspace clsvae is semisupervised because the labelled trusted set is much smaller than the unlabelled set the paper performs outlier detection and repair benchmarks on three image datasets with systematic outliers and shows that clsvae achieves favorable results strength the paper is well motivated and generally well written the key idea of subspace learning is intuitive and easy to understand results look to be good and support the claim weaknesses while i found section 3 to be well written but section 4 which contains the math and equations seems to be difficult to follow especially for a person that is not necessarily familiar with vaes and generative models perhaps a simple introduction of those could help an issue of the paper from my perspective is that the experiments are simulated and not real therefore i am not fully convinced that the developed method will have practical impact in particular is it possible to test the method on some dataset that is wellknown to be corrupted by systematic outliers rather than simulating the outliers one example that i am aware of is that in robot perception wrong correspondences is a big challenge for example in point cloud registration establishing reliable matches between a pair of point clouds is a very challenging task there is a huge amount of literature on this problem but perhaps ref1 and ref2 could be good references this maybe a future work direction for the authors to consider i also think an ablation study could help where the impact of the size of the trusted set on the performance could be investigated if the authors increase the size of the trusted set from 10 to 50 do we expect a gain in the performance ref1 yang heng jingnan shi and luca carlone teaser fast and certifiable point cloud registration ieee transactions on robotics 37 no 2 2020 314333 ref2 yi kwang moo eduard trulls yuki ono vincent lepetit mathieu salzmann and pascal fua learning to find good correspondences in proceedings of the ieee conference on 
computer vision and pattern recognition pp 26662674 2018 i am not an expert in this field i tend to weak accept this paper but i will also see if other reviewers have serious concerns
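For intuition, the clean/dirty split latent that the reviews describe can be sketched as a toy VAE whose code is a concatenation of two parts, with repair performed by decoding from the clean part only. This is a heavily simplified illustration: the actual CLSVAE uses a mixture model with a Bernoulli mixing weight, a semi-supervised objective over the trusted set, and its own repair procedure, none of which is reproduced here; zeroing the dirty code is just one simple heuristic for the repair step, and all layer sizes and names below are assumptions.

```python
import torch
import torch.nn as nn

class SplitLatentVAE(nn.Module):
    """Toy encoder/decoder whose latent code is split into clean and dirty parts."""

    def __init__(self, x_dim=784, clean_dim=8, dirty_dim=8, hidden=128):
        super().__init__()
        z_dim = clean_dim + dirty_dim
        self.clean_dim = clean_dim
        self.enc = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * z_dim))   # means and log-variances
        self.dec = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, x_dim))

    def encode(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()     # reparameterisation
        return z, mu, logvar

    def repair(self, x):
        # Decode from the clean subspace only: the dirty part, which is meant
        # to absorb the recurring corruption pattern, is replaced by zeros.
        z, _, _ = self.encode(x)
        z_clean_only = torch.cat([z[..., :self.clean_dim],
                                  torch.zeros_like(z[..., self.clean_dim:])], dim=-1)
        return self.dec(z_clean_only)

model = SplitLatentVAE()
repaired = model.repair(torch.rand(4, 784))   # (4, 784) reconstructions
```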
### Summary: | this is a borderline paper with 2 marginally above and a marginally below acceptance recommendations while the authors provided valid responses to some of the criticism i still find some of the motivation and assumptions not sufficiently clear theoretical and practical issues are mixed and the validation on only synthetic data raises practical questions | [
… input_ids: 1,434 token IDs encoding the prompt, review, and summary above (full sequence omitted for readability) … ] | [
… attention_mask: 1,434 ones (no padding; omitted for readability) … ] | [
… labels: identical to input_ids, 1,434 token IDs (omitted for readability) … ]
Below is a review of a research paper from a conference or journal. Please write a summary of the review.
### Review:
this paper studies the effectiveness of using the restricted computational model class of lowdegree polynomials for assessing statisticalcomputation gaps for highdimensional statistical inference problems using the problem of reconstruction on trees the authors identify problem settings for which averagecase reconstruction is impossible using lowdegree polynomials yet reconstruction is possible with computationally efficient methods strengths major significance informationcomputation tradeoffs for high dimensional statistical inference problems are important there are several restricted computational models that have been used to study gaps for different problems planted clique sparse pca etc lowdegree polynomials are a prominent model the authors show that there are problems where thresholds identified using lowdegree polynomials do not match thresholds for existence of computationally efficient methods the extension of the lower bound to investigate when kernel ridge regression would fail is interesting the authors provide substantial discussions strengths minor while work is theoretical the authors include an experiment fig 1 investigate performance of lowdegree polynomials for small but nonzero lambda2 suggesting that the inaccuracy of lowdegree polynomials to capture computational hardness may not narrowly occur on the limit case of lambda20 analytically studied in the paper weaknesses i did not identify any major weaknesses very minor notes line 30 acronym csp not defined line 78 the acronym for statistical query sq is used multiple times before being defined line 95 notation c not introduced yet why not xr for realization of root variable line 103 rho a subscript appearing in the conditioning event is not defined lines 117 and 122 missing parenthesis line 189 a constant c for nc is discussed but that is different than the c as the root variable realization right if so id suggest not overloading notation line 210 capitalize markov line 212 should suppose that be there line 218 has a condition on mdelta and in 212 the notation m delta epsilon and c not yet specified line 217218 i dont think the notation varphicdot was introduced yet yes docsepinvestigates the problem of tree reconstruction from leaves to root through low degree polynomials lowdegree polynomials as a computational model used to study intractability of learninginference problems seems to be a useful and important model so studying when it doesdoesnt reflect the wider class of polynomialtime algorithms is very important the results in this paper provide an important insight in this direction the assumptions are pretty carefully justified and the authors are largely working in standard models the main limitation i can see is that the tree structure that makes this analysis go through is somewhat limited docsepthis paper studies the problem of reconstruction on trees through low degree polynomials the authors show that there exists simple tree models in which nontrivial reconstruction of the root value is possible in polynomial time and if the tree is unknown but given samples with correlated root assignments nontrivial reconstruction is possible with a statistical query algorithm the paper also provide a result related to rbf kernel ridge regression for predicting root coloration an open question about low degree polynomials and the ks threshold is also proposed the topic of this paper is completely out of my area and any technical comments i make will probably be unfair to the authors regarding organization i can hardly 
follow the paper partly it is because the topic is out of my area however in the current shape everything including introduction preliminaries definitions theorems remarks are all mixed into the two massive sections while i understand there might be a lot of contents in the paper i think the presentation can definitely be improved i also dont know what are the exact contributions in this paper for example theorem 5 mossel and peres 2003 appears under section 12 our results is it a new result or from a prior work not applicable docsepthis paper studies the problem of tree reconstruction on dary trees the root of the tree is given a spin xrho sim nu which is then propagated down to the leaves according to a markov channel m the problem is then given the spins xl at the leaves to recover the original root spin xrho several variants of this model are considered withwithout noise at the leaves with the underlying tree known or not or with several realizations of the same tree process the focus in this paper is on lowdegree polynomial reconstruction for which values of m can xrho be estimated by a lowdegree polynomial in the leaves ie a function of the form fxl sumssubset l s d fsxs where xs is the subset of leaves in s it is already known that when d lambda2m2 1 a linear d 1 estimator suffices on the other hand general reconstruction using belief propagation is possible for almost all m the authors show that if lambda2m 0 then no polynomial algorithm of degree leq nc where n is the number of leaves can recover the true root spin the proof is based on the property that mk is of rank 1 for some k and hence the correlation between a vertex x and its kth ancestor is 0 as a corollary they show that a kernel ridge regression method needs at least enc samples to learn the tree reconstruction problem since it needs to approximate a polynomial of degree at least nc the article also contains a positive result for the case where the underlying tree is unknown equivalently where the leaves are known up to permutation they show that for a fixed root spin xrho and a polynomial number of samples from the tree process started at xrho there exists a reconstruction algorithm that recovers xrho better than random chance the tree reconstruction problem is ubiquitous in many inference problems eg community detection for which computationaltostatistical gaps are still fairly unexplained its interesting to see a low degree polynomial approach to this problem which bridges the gap between the census reconstruction problem and bp approaches the paper is overall wellwritten and easy to read the introduced notions are clearly defined with the notable exception of the vstat oracle and the results are nicely presented it is especially interesting that the impossibility result extends to onc degrees although this might just be a consequence of lambda2m 0 the main weakness of this paper in my opinion is its specificity all proofs hinge on the specific properties that occur when mk is of rank one which implies very string independence properties between tree nodes the tree structure is similarly rigid with only the dary tree considered however this is a good first step which i hope will inspire more work on this topic the limitations have been adequately addressed
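As a concrete toy illustration of the broadcast model discussed above, the snippet below propagates a binary root spin down a d-ary tree through a symmetric flip channel and then estimates the root with a simple degree-1 statistic, a majority vote over the leaves. The binary alphabet, the flip probability, and the majority rule are assumptions chosen only for illustration and are not the specific channels M analysed in the paper.

```python
# Toy broadcast-and-reconstruct simulation on a d-ary tree (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def broadcast_leaves(root, d=2, depth=8, flip=0.1):
    level = np.array([root])
    for _ in range(depth):
        children = np.repeat(level, d)              # every node gets d children
        flipped = rng.random(children.size) < flip  # each edge flips independently
        level = np.where(flipped, 1 - children, children)
    return level                                    # leaf spins in {0, 1}

def majority_estimate(leaves):
    # A degree-1 (linear) statistic of the leaves, thresholded at one half.
    return int(leaves.mean() > 0.5)

root = int(rng.integers(0, 2))
leaves = broadcast_leaves(root)
print("root:", root, "estimate:", majority_estimate(leaves))
```

With these toy parameters the majority vote typically recovers the root well above chance; the regimes of interest in the paper are channels where such low-degree estimators provably fail even though other efficient reconstruction methods succeed.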
### Summary: | this paper studies using lowdegree polynomials for analyzing statisticalcomputational gaps for highdimensional inference problems and identify averagecase settings that exhibit this gap this is a nice paper and above the bar though it perhaps appeal to only a theoretical audience | [
… input_ids: 1,339 token IDs encoding the prompt, review, and summary above (full sequence omitted for readability) … ] | [
… attention_mask: 1,339 ones (no padding; omitted for readability) … ] | [
… labels: identical to input_ids, 1,339 token IDs (omitted for readability) … ]
Below is a review of a research paper from a conference or journal. Please write a summary of the review.
### Review:
this paper deals with diffusion models for inverse problems an important drawback of this approach is that to produce satisfactory results it is necessary to iterate the forward and the reverse diffusions several times based on the manifold hypothesis the authors show that this occurs because the score function only acts on the normal direction of the manifold and therefore a drift occurs pushing the inference path out of the manifold to solve this besides the measurement constraint equation they propose to include a constraint on the reverse diffusion equation given by the gradient of the l2 residual of the inverse problem for the bayes optimal denoising step from noise2noise kim and ye neurips 2021 the authors prove that this constraint forces the diffusion to lie on the data manifold these claims are supported by theoretical proofs and a clear geometric interpretation strengths the paper is very well written the idea of combining noise2score and diffusion models is novel interesting and completely sound the proposed strategy is fully backed up by the theortical arguments and proofs provided by the authors thorough comparison with stateoftheart diffusion modelbased approaches and supervised learningbased baselines confirm the superiority of the proposed approach weaknesses it would be nice to show and discuss failure cases or situations when the proposed approach does not outperform the others minor comments table x figure y section z etc table x figure y section z etc eq 3 xt xt fix punctuation at the end of eqs 6 and 9 l71 mathbbrn mathbbrm l76 utilize l77 rely eqs 13 and 14 for consistency write them in the discrete setting l144 uses l154 recall the definition of p0 it was only defined in section 2 suppmat l465466 check the sentence suppmat fig 6 there are two lines in red that should be in green suppmat l502 epsilontheta ztheta suppmatl507 4 table 4 suppmat l509 1 algorithm 1 yes docsepthe paper proposes a simple augmentation to existing scorebased diffusion solvers for inverse problems by applying an extra correction term in the diffusion step this term in theory captures the onmanifold component of the current estimation error after rebuttal the authors have convinced me that the empirical experiments are sound and have also tempered their theoretical claims i am happy with the paper in its present state as i am convinced that it presents a novel technique for inverse imaging and achieves a solid improvement on the prior state of the art strengths the proposed step is elegant and computationally simple the results if correct see below suggest massive improvements in signal recovery weaknesses the main theorem does not state its assumptions clearly moreover i dont believe they hold and this is quite problematic for the proposed algorithm see overview the empirical experiments are somewhat problematic see overview minor the use of bayes rule in section 31 to obtain eqs 11 and 12 is somewhat misleading this is a general technique for conditional sampling in scorebased diffusion models see for eg 1 2 3 overview overall i am very excited by the direction proposed here but i think there are currently major issues in the reasoning and execution of the paper i believe the paper will be much stronger after addressing these problems ultimately my concerns are twofold theoretically i am not convinced that the main result of the paper upon which the proposed algorithm rests holds in practice theorem 1 requires proposition 1 to hold and proposition 1 assumes that the score function is a global 
minimizer of the denoising loss eq 9 this is nearly impossible in a diffusion model since the denoising function at tt or t0 in the reverse process must recover an image from pure noise this is a highly illdefined problem another way to see that this cannot hold if the score function was a global minimizer the generative process for unconditional diffusion models would take 1 inference step in practice it takes thousands of steps in the vanilla diffusion model the suboptimality of the score function has consequences that extend past theory as the central manifold constraint gradient mcg term also relies on the global optimality of the score function to define manifold projection eq 13 thus this key assumption is quite central to the entire paper empirically the visual quality of competing work seems like it could be somewhat misrepresented especially in image inpainting for example in figure 3 the visual quality of repaint 4 and lama 5 both differ greatly from that reported in their respective papers see figure 1 for each paper moreover looking more closely at the experimental setup of the competing works repaint and lama the two papers share the same experiments for inpainting and establish a baseline namely their inpainting task involves recovering an original image corrupted by wide and narrow masks this paper uses a different set of masks and reports significantly different lpip scores for repaint and lama for this task to be clear i think it is fair to change the experimental setup however i wonder if the authors did not spend enough time tuning the competing models in this new setup and consequently misrepresented the competing works 1 dhariwal p nichol a 2021 diffusion models beat gans on image synthesis advances in neural information processing systems 34 87808794 2 sohldickstein j weiss e maheswaranathan n ganguli s 2015 june deep unsupervised learning using nonequilibrium thermodynamics in international conference on machine learning pp 22562265 pmlr 3 song y sohldickstein j kingma d p kumar a ermon s poole b 2020 scorebased generative modeling through stochastic differential equations arxiv preprint arxiv201113456 4 lugmayr a danelljan m romero a yu f timofte r van gool l 2022 repaint inpainting using denoising diffusion probabilistic models in proceedings of the ieeecvf conference on computer vision and pattern recognition pp 1146111471 5 suvorov r logacheva e mashikhin a remizova a ashukha a silvestrov a lempitsky v 2022 resolutionrobust large mask inpainting with fourier convolutions in proceedings of the ieeecvf winter conference on applications of computer vision pp 21492159 no docsepthis paper proposes a framework for solving inverse problems using diffusion models as priors the idea is to introduce another constraint in the formulation of the posterior distribution that enforces the generated sample to be on the manifold of natural images the introduced constraint is motivated by the tweedie formula that relates the score function of the noisy data distribution and the best mmse estimator denoising several experimental results on different imaging inverse problems show that the proposed method outperforms existing methods in fid lpips psnr and ssim metrics strengths the paper introduces an interesting adaptation to traditional sampling methods motivated by the tweedie formula the paper is generally well written and the idea is well motivated several experimental results on different imaging inverse problems show that the proposed method outperforms existing methods in fid 
lpips psnr and ssim metrics weakness the paper exposition could be better the paper relates the idea a lot to noise2score but the key motivation is the tweedie formula with that in mind it would be better to introduce the tweedie formula and relevant work better see below motivation for the additional constraint is not clear section 31 in particular eq 14 deduction is unclear mathcalc is a set so how should one interpret pmathcalc x how is obtained w for each application theoretical findings should be better connected to the proposed algorithm in general i like the paper the idea is interesting and the experimental results show a clear win in performance the main current issue to me is that the motivation and exposition is not clear enough most of the pieces are already there but a reorganization and better discussion is needed limitations are correctly discussed docsepin this paper the authors propose a new methodology to perform conditional sampling using denoising diffusion models namely they combine existing conditional scorematching techniques using a stochastic contraction with a slight change of the score indeed instead of considering the unconditional score functional they consider the score function associated with the conditional distribution given by some manifold constraint the authors show that this term is in fact complementary to the existing unconditional score term as the score term only locally project on the data manifold whereas the additional term moves locally on the tangent space of this manifold they evaluate their methodologies against some baselines and show promising results for inpainting colorization and ctscan reconstruction strengths the paper introduces a new methodology for inpainting which makes sense from a theoretical point of view is easy to implement and seems to yield good results weaknesses i think that a deeper discussion with recent works such as 1 is missing it is not clear from reading the main document what is the state of the art conditional sampling method using diffusion models at least previous to this work in terms of novelty it seems that the proposed correction was already introduced in 2 as a consequence the method proposed by the authors can be see as a generalization to arbitrary linear problems as stated by the authors and therefore has limited novely with the stochastic contraction term and the gradient term proposed by the authors it is not clear at all if we are sampling from the right posterior distribution i understand that the method yields good visual results but in fields like ctreconstruction the visual quality should not be the only metrics as we also need a models which provide good uncertainty quantification such considerations are missing here as a first question can you characterize what is the invariant measure of the proposed diffusion model 1 kawar et al 2022 denoising diffusion restoration models 2 ho et al 2022 video diffusion models the limitations are discussed in the conclusion i found this discussion to be appropriate docsepthe paper suggests to improve the performance a diffusionbased reconstruction scheme by adding to its updates the gradient of a loglikelihood term that uses the link between xi to x0 some mathematical motivation is provided 1 the introduction part is lacking you should mention other methods that use the same pretrained priors for solving many almost arbitrary reconstruction tasks ie in an unsupervised fashion for example ganbased reconstruction methods ab and the powerful plugandplay pp 
denoisers approach cde the pp denoisers approach c which is much older than diffusionbased reconstruction methods commonly uses generalporpose cnn denoisers to impose the prior within iterative schemes d e so in fact diffusionbased reconstruction methods can be understood as pp variant with classspecific denoisers determined by the training data that can handle extreme noise levels which makes them generative models i expect these topics to be discussed in the introduction 2 the continuous formulation of the diffusion process in the introduction part seems redundant 3 fix the notation in eq 15 you mean y rather than y0 right also the argument of hatx0cdot is xi right this should be clearly written 4 the presentation of the mcg ingredient needs to be improved essentially the manifold constraint is the information on the prior of the signal the transformation from xi to x0 egthe constraint in eq 13 is independent of y yet in the term that is added to eq 15 compared to eq 7 you already connect it with the loglikelihood term data fidelity term thus it should be clearly stated throughout the paper that this new term combines data fidelity term and the prior information of the transformation from xi to x0 furthermore i tend to believe that there exist other works that also use perhaps in a similar way the loglikelihood function in their iterative diffusion reconstruction schemes please make sure that you cover the exiting literature 5 note that for alpha0 your additional term is canceled thus your experiments should include ablation study and also some discussion and examination of the effect of different values of alpha 6 please verify that there is no mismatch between eq 15 and the algorithms that you actually use in the experiments for example in algorithm 1 in appendix c line 10 is not aligned with eq 15 alpha is multiplied with a matrix moreover observe that your mcg component in line 10 of algorithm 1 vanishes when you compute ipp ddxi py0px0 ipp ddxi py0px0 ipp ddx0 py0px0 dx0dxi ipp pppx0y0 dx0dxi 0 because ipp pp0 since for inpainting p is m rows of the identity matrix in so pp is a projection matrix 7 in your algorithm all the choices of a and b for the different tasks including ct makes eq 16 and eq 8 coincide with the backprojection step in e the iterative usage of such a data consistency step with pretrained denoisers which as mentioned above is very similar to diffusionbased reconstruction has been proposed in e and theoretically analyzed in followup works such very related works should be mentioned 8 the competitors in the experiments seem somewhat weak for example the recovery that is presented to some of them changes even the known pixels in inpainting recoveries that do not utilize such known information are obviously weak hence it would have been interesting and informative to see comparison with strong nonnaive ganbased reconstruction methods such as b obviously with gans that are trained on the same training data as the other methods eg by using common datasets and gans the method in b is a clear example of a method that does not fall into the authors statement that the ability to maintain perfect measurement consistency is often not satisfied for unsupervised ganbased solutions 9 in the experiments section you must include a discussion on the number of iterations and runtime of the different methods that are examined a bora a jalal a price e and dimakis ag 2017 july compressed sensing using generative models in international conference on machine learning pp 537546 pmlr b 
hussein sa, tirer t and giryes r, 2020, april. image-adaptive gan based reconstruction. in proceedings of the aaai conference on artificial intelligence (vol. 34, no. 04, pp. 3121-3129).

[c] venkatakrishnan sv, bouman ca and wohlberg b, 2013, december. plug-and-play priors for model based reconstruction. in 2013 ieee global conference on signal and information processing (pp. 945-948). ieee.

[d] zhang k, zuo w, gu s and zhang l, 2017. learning deep cnn denoiser prior for image restoration. in proceedings of the ieee conference on computer vision and pattern recognition (pp. 3929-3938).

[e] tirer t and giryes r, 2018. image restoration by iterative denoising and backward projections. ieee transactions on image processing, 28(3), pp. 1220-1234.

my questions and comments are listed above.
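to make the guidance and data-consistency terms discussed in points 4-7 concrete, the sketch below shows one common way a log-likelihood gradient on the denoised estimate and a projection-type consistency step are combined in a diffusion-based inpainting loop. it is an illustrative sketch only, not the reviewed paper's algorithm; the denoiser, the schedule `alphas` and the step size are assumed placeholder names.

```python
import numpy as np

def guided_inpainting_loop(y, mask, denoise, alphas, step_size=1.0, seed=0):
    """Illustrative guided reverse-diffusion loop (not the reviewed paper's exact algorithm).

    y       : observed image (values on unobserved pixels are ignored)
    mask    : 1 on observed pixels, 0 elsewhere (plays the role of P^T P)
    denoise : pretrained denoiser (x_i, t) -> x0_hat, a Tweedie-style posterior-mean estimate
    alphas  : per-step signal coefficients of the diffusion schedule (assumed given)
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(y.shape)                      # start from pure noise
    for t in reversed(range(len(alphas))):
        x0_hat = denoise(x, t)                            # estimate of the clean image
        # gradient of 0.5 * ||P x0_hat - y||^2 taken w.r.t. x0_hat, restricted to observed pixels
        x0_hat = x0_hat - step_size * mask * (x0_hat - y)
        # hard measurement consistency (projection / back-projection-style step)
        x0_hat = mask * y + (1.0 - mask) * x0_hat
        noise = rng.standard_normal(y.shape) if t > 0 else 0.0
        x = np.sqrt(alphas[t]) * x0_hat + np.sqrt(1.0 - alphas[t]) * noise
    return x
```

note that in this toy form the gradient is taken with respect to the denoised estimate directly; taking it with respect to $x_i$ through the denoiser is exactly where the chain-rule and notation questions raised in points 3 and 6 arise.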
### Summary: | this paper proposes to incorporate manifold constraints in diffusion model for inverse problems the general consensus after rebuttal is that this submission is worthy of acceptance to neurips please incorporate the remaining reviewers feedback for the cameraready version | [
8245,
10775,
616,
275,
31406,
1076,
4836,
8687,
27930,
271,
3236,
2460,
40634,
407,
4618,
285,
6891,
25965,
436,
2929,
4648,
247,
1027,
873,
273,
25965,
285,
5012,
3012,
1027,
39322,
532,
7363,
323,
1234,
1143,
285,
298,
2902,
323,
436,
4836,
281,
320,
2590,
891,
1158,
352,
310,
4344,
281,
1818,
253,
5661,
9978,
2299,
891,
4282,
604,
253,
4477,
858,
417,
6947,
2217,
673,
25184,
253,
11771,
3210,
275,
436,
747,
9978,
285,
17912,
25355,
264,
253,
11771,
2987,
50276,
18,
42158,
1792,
18758,
268,
50276,
79,
469,
311,
247,
43425,
12393,
3210,
7171,
305,
507,
327,
2460,
9066,
16424,
275,
11454,
1491,
5162,
2718,
5910,
11422,
1438,
2597,
3953,
374,
594,
73,
392,
781,
6339,
480,
359,
739,
299,
6429,
1041,
7523,
266,
10511,
295,
50276,
26774,
22357,
256,
4104,
480,
2517,
3676,
440,
35421,
4715,
970,
5293,
48494,
5461,
30141,
275,
5213,
8059,
327,
5145,
4715,
7266,
374,
9726,
1423,
2082,
268,
1686,
83,
495,
4498,
340,
594,
73,
392,
781,
6339,
480,
6963,
785,
277,
268,
465,
22711,
247,
209,
693,
251,
256,
50276,
5367,
1306,
270,
9169,
4868,
3169,
1006,
800,
14053,
949,
19191,
8967,
7424,
549,
32693,
638,
3845,
549,
32693,
1252,
883,
1706,
3208,
577,
26285,
11159,
83,
247,
16447,
437,
17551,
278,
10102,
2771,
247,
340,
86,
269,
4522,
1171,
442,
391,
50276,
6148,
564,
311,
298,
1384,
1423,
1234,
1143,
275,
31406,
1076,
970,
1850,
80,
2182,
12393,
37851,
3210,
275,
10061,
273,
253,
26332,
70,
886,
39985,
8059,
327,
4382,
8113,
285,
3102,
8981,
7266,
11601,
45759,
44389,
608,
402,
19449,
729,
391,
2412,
2679,
6156,
299,
48676,
20323,
249,
247,
867,
478,
8947,
247,
15898,
2788,
3227,
247,
2830,
5410,
18540,
247,
50275,
282,
2503,
953,
4742,
362,
1384,
1423,
6064,
18848,
461,
1781,
8989,
275,
31406,
1076,
342,
269,
15421,
2410,
17009,
275,
10061,
273,
253,
26332,
70,
886,
39985,
8986,
8059,
327,
4893,
273,
4382,
8113,
7266,
25901,
4529,
17220,
50276,
2369,
5474,
33032,
2520,
2929,
29328,
247,
7792,
323,
16161,
13737,
3237,
970,
12393,
3210,
347,
2235,
641,
253,
2934,
310,
281,
9569,
1529,
7658,
275,
253,
15895,
273,
253,
12637,
3268,
326,
546,
36217,
253,
4561,
3410,
281,
320,
327,
253,
16751,
273,
3626,
3888,
253,
5611,
7658,
310,
17194,
407,
253,
13660,
264,
466,
7212,
326,
7033,
253,
4868,
1159,
273,
253,
27620,
941,
3268,
285,
253,
1682,
5823,
339,
29107,
1850,
80,
2182,
2067,
5661,
1543,
327,
1027,
6979,
13737,
3237,
921,
326,
253,
4081,
1332,
41731,
13015,
5368,
3082,
275,
269,
301,
39322,
2824,
3714,
23838,
285,
256,
3549,
17082,
20544,
50276,
783,
2929,
23970,
271,
4722,
15644,
281,
5899,
10491,
3082,
17194,
407,
253,
13660,
264,
466,
7212,
50275,
783,
2929,
310,
3839,
973,
3542,
285,
253,
2934,
310,
973,
17194,
50276,
43249,
5661,
1543,
327,
1027,
6979,
13737,
3237,
921,
326,
253,
4081,
1332,
41731,
13015,
5368,
3082,
275,
269,
301,
39322,
2824,
3714,
23838,
285,
256,
3549,
17082,
50276,
20881,
1255,
50276,
783,
2929,
47284,
812,
320,
1805,
50276,
783,
2929,
7033,
253,
2934,
247,
2257,
281,
6046,
19,
18891,
533,
253,
2234,
16038,
310,
253,
13660,
264,
466,
7212,
342,
326,
275,
2564,
352,
651,
320,
1805,
281,
9569,
253,
13660,
264,
466,
7212,
285,
4623,
789,
1805,
923,
2708,
50276,
24013,
7639,
323,
253,
3081,
7658,
310,
417,
2590,
2593,
4562,
275,
1798,
16186,
1638,
34143,
310,
12744,
14168,
32557,
310,
247,
873,
594,
849,
943,
581,
4665,
268,
1588,
68,
50276,
89,
50275,
5430,
310,
2797,
259,
323,
1016,
2898,
50276,
783,
33977,
4342,
943,
320,
1805,
4802,
281,
253,
4081,
5933,
50275,
249,
2087,
891,
751,
253,
2929,
253,
2934,
310,
4722,
285,
253,
5661,
1543,
921,
247,
2590,
3330,
275,
3045,
253,
2022,
1655,
2523,
281,
479,
310,
326,
253,
16038,
285,
47284,
310,
417,
2590,
2217,
954,
273,
253,
7437,
403,
2168,
627,
533,
247,
40386,
285,
1805,
5955,
310,
3058,
50275,
17465,
569,
403,
9113,
5469,
5474,
339,
9852,
436,
2929,
253,
4477,
12661,
247,
747,
16182,
281,
1347,
17697,
10491,
970,
1850,
80,
2182,
12393,
3210,
10775,
597,
13398,
5368,
17697,
660,
4362,
16464,
5609,
970,
247,
19191,
22170,
342,
247,
4512,
1818,
273,
253,
4868,
6296,
3185,
273,
7296,
253,
49795,
4868,
5164,
597,
1908,
253,
4868,
1159,
2330,
342,
253,
17697,
3268,
1677,
407,
690,
16751,
7658,
253,
4477,
921,
326,
436,
1307,
310,
275,
958,
19767,
281,
253,
5368,
49795,
4868,
1307,
347,
253,
4868,
1307,
760,
12171,
2199,
327,
253,
941,
16751,
5727,
253,
3081,
1307,
9727,
12171,
327,
253,
28196,
2317,
273,
436,
16751,
597,
7472,
616,
39396,
1411,
690,
1666,
25379,
285,
921,
12532,
1543,
323,
275,
31406,
1076,
3295,
1320,
285,
45830,
22798,
14433,
20544,
50275,
783,
2929,
23970,
247,
747,
16182,
323,
275,
31406,
1076,
534,
2789,
3282,
432,
247,
10527,
1127,
273,
1859,
310,
3477,
281,
3359,
285,
3133,
281,
4917,
1175,
1543,
50276,
20881,
1255,
265,
50275,
74,
1158,
326,
247,
12861,
5955,
342,
3332,
2987,
824,
347,
337,
310,
5816,
352,
310,
417,
2590,
432,
4361,
253,
2022,
3389,
752,
310,
253,
1375,
273,
253,
1445,
17697,
10491,
1332,
970,
12393,
3210,
387,
1878,
2045,
281,
436,
789,
50275,
249,
2426,
273,
38135,
352,
3133,
326,
253,
4081,
10618,
369,
2168,
5611,
275,
374,
347,
247,
9936,
253,
1332,
4081,
407,
253,
4477,
476,
320,
923,
347,
247,
26647,
281,
10341,
4872,
3237,
347,
4767,
407,
253,
4477,
285,
3103,
556,
3710,
22458,
600,
50275,
3113,
253,
19191,
22170,
1307,
285,
253,
11786,
1307,
4081,
407,
253,
4477,
352,
310,
417,
2590,
387,
512,
604,
359,
403,
10491,
432,
253,
987,
12637,
3268,
891,
2096,
326,
253,
1332,
11026,
1175,
5304,
1543,
533,
275,
4910,
751,
260,
5643,
11682,
253,
5304,
3290,
943,
417,
320,
253,
760,
17082,
347,
359,
671,
878,
247,
3210,
534,
2085,
1175,
11649,
21652,
824,
15711,
403,
5816,
1060,
347,
247,
806,
1953,
476,
368,
17710,
752,
310,
253,
13727,
2557,
273,
253,
4081,
12393,
1566,
50276,
18,
465,
1403,
274,
1162,
355,
1384,
1423,
50276,
3354,
80,
2182,
12393,
20384,
3210,
374,
8511,
1162,
355,
1384,
1423,
50276,
16455,
12393,
3210,
253,
7364,
403,
5469,
275,
253,
6452,
891,
1119,
436,
5955,
281,
320,
4569,
5474,
339,
431,
248,
2929,
5936,
281,
3157,
253,
3045,
247,
12393,
3169,
14433,
6974,
407,
6240,
281,
697,
11269,
253,
11786,
273,
247,
2412,
7513,
10202,
1307,
326,
4648,
253,
3048,
875,
1269,
74,
281,
1269,
17,
690,
15965,
16038,
310,
2530,
337,
253,
10199,
629,
310,
14999,
368,
943,
3748,
643,
3082,
326,
897,
253,
1072,
3215,
11273,
2235,
641,
323,
16161,
1142,
2761,
10341,
14433,
8892,
26332,
275,
271,
440,
35421,
8142,
323,
1650,
36827,
3169,
14433,
3082,
490,
285,
253,
6422,
10358,
395,
1993,
7266,
1850,
10225,
398,
2746,
260,
615,
253,
7266,
1850,
10225,
398,
2746,
260,
534,
310,
1199,
5662,
685,
12393,
3169,
14433,
3082,
7744,
4648,
2087,
1831,
3014,
260,
9866,
1850,
10225,
398,
281,
16209,
253,
2720,
1561,
34560,
15849,
277,
299,
50276,
601,
275,
958,
12393,
3169,
14433,
3082,
476,
320,
7192,
347,
7266,
12955,
342,
966,
6160,
1850,
10225,
398,
3413,
407,
253,
3733,
941,
326,
476,
6016,
9559,
6046,
2308,
534,
2789,
731,
1006,
800,
3210,
891,
1902,
841,
12989,
281,
320,
5469,
275,
253,
10199,
50276,
19,
253,
5415,
15895,
273,
253,
12393,
1232,
275,
253,
10199,
629,
3133,
28116,
50276,
20,
4993,
253,
14951,
275,
16186,
1458,
368,
1599,
340,
2581,
685,
340,
17,
987,
50276,
12563,
253,
4154,
273,
7856,
89,
17,
3830,
310,
1269,
74,
987,
436,
943,
320,
4518,
3542,
50276,
21,
253,
9759,
273,
253,
278,
29676,
24405,
3198,
281,
320,
5520,
9093,
253,
16751,
7658,
310,
253,
1491,
327,
253,
2720,
273,
253,
2625,
253,
9261,
432,
1269,
74,
281,
1269,
17,
24088,
783,
7658,
275,
16186,
2145,
310,
3907,
273,
340,
2568,
275,
253,
1307,
326,
310,
2879,
281,
16186,
1458,
2429,
281,
16186,
818,
368,
2168,
4684,
352,
342,
253,
2412,
7513,
10202,
1307,
941,
32422,
1307,
3021,
352,
943,
320,
4518,
4767,
4768,
253,
2929,
326,
436,
747,
1307,
24772,
941,
32422,
1307,
285,
253,
2720,
1491,
273,
253,
9261,
432,
1269,
74,
281,
1269,
17,
33810,
891,
5257,
281,
2868,
326,
627,
2226,
643,
2987,
326,
671,
897,
4931,
275,
247,
2074,
1039,
253,
2412,
7513,
10202,
1159,
275,
616,
34560,
12393,
14433,
15849,
4496,
1056,
2119,
326,
368,
3835,
253,
44528,
6239,
50275,
22,
3877,
326,
323,
9765,
17,
634,
3081,
1307,
310,
32093,
3021,
634,
4679,
943,
2486,
28913,
1263,
285,
671,
690,
5955,
285,
8368,
273,
253,
1055,
273,
1027,
2193,
273,
9765,
50276,
23,
4496,
12654,
326,
627,
310,
642,
29713,
875,
16186,
1458,
285,
253,
11333,
326,
368,
2686,
897,
275,
253,
4679,
323,
1650,
275,
5933,
337,
275,
30762,
260,
1386,
884,
310,
417,
15616,
342,
16186,
1458,
9765,
310,
31458,
342,
247,
4315,
25761,
10018,
326,
634,
278,
29676,
4445,
275,
1386,
884,
273,
5933,
337,
27309,
672,
368,
11897,
891,
377,
50276,
1678,
2981,
7239,
17,
3498,
17,
50276,
5265,
50276,
1678,
2981,
7239,
17,
3498,
17,
50276,
5265,
50276,
1678,
89,
17,
7239,
17,
3498,
17,
50276,
9665,
17,
69,
2981,
50276,
5265,
50276,
377,
3498,
17,
90,
17,
50276,
9665,
17,
69,
2981,
50276,
17,
50276,
12157,
891,
377,
50276,
377,
17,
1580,
323,
275,
31406,
1076,
268,
310,
278,
10175,
273,
253,
6489,
4315,
275,
594,
7266,
310,
247,
12378,
4315,
50276,
24,
275,
634,
5933,
512,
253,
10165,
273,
247,
285,
270,
323,
253,
1027,
8892,
1690,
45830,
2789,
16186,
1668,
285,
16186,
854,
28588,
342,
253,
896,
856,
5342,
3213,
275,
299,
253,
34560,
10393,
273,
824,
247,
941,
15274,
3213,
342,
3215,
11273,
1850,
10225,
398,
534,
347,
5393,
1840,
310,
1077,
2074,
281,
12393,
3169,
14433,
556,
644,
4081,
275,
299,
285,
28055,
5867,
275,
956,
484,
2987,
824,
1077,
2905,
2987,
943,
320,
5393,
50276,
25,
253,
21607,
275,
253,
4679,
1646,
8489,
5075,
323,
1650,
253,
7355,
326,
310,
3559,
281,
690,
273,
731,
2544,
1014,
253,
1929,
15115,
275,
275,
31406,
1076,
9295,
447,
326,
513,
417,
16584,
824,
1929,
1491,
403,
9090,
5075,
7613,
352,
651,
452,
644,
4722,
285,
27096,
281,
923,
5301,
342,
2266,
1327,
2072,
422,
36827,
3169,
14433,
3082,
824,
347,
270,
9090,
342,
305,
507,
326,
403,
10166,
327,
253,
1072,
3733,
941,
347,
253,
643,
3082,
24088,
407,
970,
1846,
15302,
285,
305,
507,
253,
1332,
275,
270,
310,
247,
2590,
1650,
273,
247,
1332,
326,
1057,
417,
2965,
715,
253,
4477,
3908,
326,
253,
3745,
281,
6558,
3962,
6814,
15274,
50276,
261,
2223,
417,
10048,
323,
440,
35421,
36827,
3169,
5482,
50276,
26,
275,
253,
4679,
2593,
368,
1364,
2486,
247,
5955,
327,
253,
1180,
273,
25142,
285,
20243,
273,
253,
1027,
3082,
326,
403,
6730,
50275,
66,
270,
6464,
247,
480,
267,
267,
247,
4376,
299,
285,
3317,
30441,
639,
4240,
480,
2988,
21012,
17950,
970,
1006,
800,
3210,
275,
5213,
8059,
327,
5145,
4715,
7266,
608,
17820,
2950,
268,
1686,
83,
50276,
67,
5424,
34103,
618,
20052,
83,
246,
285,
48496,
9820,
391,
9169,
1049,
21704,
2460,
26672,
422,
36827,
1754,
14433,
275,
10061,
273,
253,
39951,
2284,
8059,
327,
13345,
9260,
1936,
5910,
642,
16703,
7266,
30581,
1012,
13482,
50276,
68,
8097,
29846,
518,
41657,
11943,
18504,
29909,
1342,
7318,
285,
259,
37844,
4978,
270,
4072,
372,
4246,
10358,
395,
1993,
2235,
641,
323,
1566,
1754,
14433,
275,
4072,
26332,
1796,
4156,
8059,
327,
2625,
285,
1491,
5162,
7266,
898,
28333,
2385,
26332,
1796,
50276,
69,
1182,
12109,
465,
10736,
80,
259,
1149,
256,
285,
1182,
12109,
298,
4240,
4715,
3676,
260,
9866,
1850,
80,
9141,
2720,
323,
2460,
20384,
275,
10061,
273,
253,
26332,
1796,
8059,
327,
4382,
8113,
285,
3102,
8981,
7266,
6931,
1717,
1867,
1839,
50276,
70,
20052,
83,
246,
285,
48496,
9820,
391,
4765,
2460,
20384,
407,
34560,
1850,
80,
2182,
285,
19265,
20553,
26332,
1796,
13122,
327,
2460,
5162,
34360,
7266,
805,
1252,
20210,
50276,
2577,
3533,
285,
5701,
403,
7117,
1840,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
281,
19071,
16751,
10806,
275,
12393,
1566,
323,
13737,
3237,
253,
2087,
13969,
846,
30080,
22559,
310,
326,
436,
19529,
310,
18338,
273,
14924,
281,
5723,
2824,
4496,
19071,
253,
5780,
30628,
8680,
323,
253,
4049,
254,
609,
5102,
2715
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
8245,
10775,
616,
275,
31406,
1076,
4836,
8687,
27930,
271,
3236,
2460,
40634,
407,
4618,
285,
6891,
25965,
436,
2929,
4648,
247,
1027,
873,
273,
25965,
285,
5012,
3012,
1027,
39322,
532,
7363,
323,
1234,
1143,
285,
298,
2902,
323,
436,
4836,
281,
320,
2590,
891,
1158,
352,
310,
4344,
281,
1818,
253,
5661,
9978,
2299,
891,
4282,
604,
253,
4477,
858,
417,
6947,
2217,
673,
25184,
253,
11771,
3210,
275,
436,
747,
9978,
285,
17912,
25355,
264,
253,
11771,
2987,
50276,
18,
42158,
1792,
18758,
268,
50276,
79,
469,
311,
247,
43425,
12393,
3210,
7171,
305,
507,
327,
2460,
9066,
16424,
275,
11454,
1491,
5162,
2718,
5910,
11422,
1438,
2597,
3953,
374,
594,
73,
392,
781,
6339,
480,
359,
739,
299,
6429,
1041,
7523,
266,
10511,
295,
50276,
26774,
22357,
256,
4104,
480,
2517,
3676,
440,
35421,
4715,
970,
5293,
48494,
5461,
30141,
275,
5213,
8059,
327,
5145,
4715,
7266,
374,
9726,
1423,
2082,
268,
1686,
83,
495,
4498,
340,
594,
73,
392,
781,
6339,
480,
6963,
785,
277,
268,
465,
22711,
247,
209,
693,
251,
256,
50276,
5367,
1306,
270,
9169,
4868,
3169,
1006,
800,
14053,
949,
19191,
8967,
7424,
549,
32693,
638,
3845,
549,
32693,
1252,
883,
1706,
3208,
577,
26285,
11159,
83,
247,
16447,
437,
17551,
278,
10102,
2771,
247,
340,
86,
269,
4522,
1171,
442,
391,
50276,
6148,
564,
311,
298,
1384,
1423,
1234,
1143,
275,
31406,
1076,
970,
1850,
80,
2182,
12393,
37851,
3210,
275,
10061,
273,
253,
26332,
70,
886,
39985,
8059,
327,
4382,
8113,
285,
3102,
8981,
7266,
11601,
45759,
44389,
608,
402,
19449,
729,
391,
2412,
2679,
6156,
299,
48676,
20323,
249,
247,
867,
478,
8947,
247,
15898,
2788,
3227,
247,
2830,
5410,
18540,
247,
50275,
282,
2503,
953,
4742,
362,
1384,
1423,
6064,
18848,
461,
1781,
8989,
275,
31406,
1076,
342,
269,
15421,
2410,
17009,
275,
10061,
273,
253,
26332,
70,
886,
39985,
8986,
8059,
327,
4893,
273,
4382,
8113,
7266,
25901,
4529,
17220,
50276,
2369,
5474,
33032,
2520,
2929,
29328,
247,
7792,
323,
16161,
13737,
3237,
970,
12393,
3210,
347,
2235,
641,
253,
2934,
310,
281,
9569,
1529,
7658,
275,
253,
15895,
273,
253,
12637,
3268,
326,
546,
36217,
253,
4561,
3410,
281,
320,
327,
253,
16751,
273,
3626,
3888,
253,
5611,
7658,
310,
17194,
407,
253,
13660,
264,
466,
7212,
326,
7033,
253,
4868,
1159,
273,
253,
27620,
941,
3268,
285,
253,
1682,
5823,
339,
29107,
1850,
80,
2182,
2067,
5661,
1543,
327,
1027,
6979,
13737,
3237,
921,
326,
253,
4081,
1332,
41731,
13015,
5368,
3082,
275,
269,
301,
39322,
2824,
3714,
23838,
285,
256,
3549,
17082,
20544,
50276,
783,
2929,
23970,
271,
4722,
15644,
281,
5899,
10491,
3082,
17194,
407,
253,
13660,
264,
466,
7212,
50275,
783,
2929,
310,
3839,
973,
3542,
285,
253,
2934,
310,
973,
17194,
50276,
43249,
5661,
1543,
327,
1027,
6979,
13737,
3237,
921,
326,
253,
4081,
1332,
41731,
13015,
5368,
3082,
275,
269,
301,
39322,
2824,
3714,
23838,
285,
256,
3549,
17082,
50276,
20881,
1255,
50276,
783,
2929,
47284,
812,
320,
1805,
50276,
783,
2929,
7033,
253,
2934,
247,
2257,
281,
6046,
19,
18891,
533,
253,
2234,
16038,
310,
253,
13660,
264,
466,
7212,
342,
326,
275,
2564,
352,
651,
320,
1805,
281,
9569,
253,
13660,
264,
466,
7212,
285,
4623,
789,
1805,
923,
2708,
50276,
24013,
7639,
323,
253,
3081,
7658,
310,
417,
2590,
2593,
4562,
275,
1798,
16186,
1638,
34143,
310,
12744,
14168,
32557,
310,
247,
873,
594,
849,
943,
581,
4665,
268,
1588,
68,
50276,
89,
50275,
5430,
310,
2797,
259,
323,
1016,
2898,
50276,
783,
33977,
4342,
943,
320,
1805,
4802,
281,
253,
4081,
5933,
50275,
249,
2087,
891,
751,
253,
2929,
253,
2934,
310,
4722,
285,
253,
5661,
1543,
921,
247,
2590,
3330,
275,
3045,
253,
2022,
1655,
2523,
281,
479,
310,
326,
253,
16038,
285,
47284,
310,
417,
2590,
2217,
954,
273,
253,
7437,
403,
2168,
627,
533,
247,
40386,
285,
1805,
5955,
310,
3058,
50275,
17465,
569,
403,
9113,
5469,
5474,
339,
9852,
436,
2929,
253,
4477,
12661,
247,
747,
16182,
281,
1347,
17697,
10491,
970,
1850,
80,
2182,
12393,
3210,
10775,
597,
13398,
5368,
17697,
660,
4362,
16464,
5609,
970,
247,
19191,
22170,
342,
247,
4512,
1818,
273,
253,
4868,
6296,
3185,
273,
7296,
253,
49795,
4868,
5164,
597,
1908,
253,
4868,
1159,
2330,
342,
253,
17697,
3268,
1677,
407,
690,
16751,
7658,
253,
4477,
921,
326,
436,
1307,
310,
275,
958,
19767,
281,
253,
5368,
49795,
4868,
1307,
347,
253,
4868,
1307,
760,
12171,
2199,
327,
253,
941,
16751,
5727,
253,
3081,
1307,
9727,
12171,
327,
253,
28196,
2317,
273,
436,
16751,
597,
7472,
616,
39396,
1411,
690,
1666,
25379,
285,
921,
12532,
1543,
323,
275,
31406,
1076,
3295,
1320,
285,
45830,
22798,
14433,
20544,
50275,
783,
2929,
23970,
247,
747,
16182,
323,
275,
31406,
1076,
534,
2789,
3282,
432,
247,
10527,
1127,
273,
1859,
310,
3477,
281,
3359,
285,
3133,
281,
4917,
1175,
1543,
50276,
20881,
1255,
265,
50275,
74,
1158,
326,
247,
12861,
5955,
342,
3332,
2987,
824,
347,
337,
310,
5816,
352,
310,
417,
2590,
432,
4361,
253,
2022,
3389,
752,
310,
253,
1375,
273,
253,
1445,
17697,
10491,
1332,
970,
12393,
3210,
387,
1878,
2045,
281,
436,
789,
50275,
249,
2426,
273,
38135,
352,
3133,
326,
253,
4081,
10618,
369,
2168,
5611,
275,
374,
347,
247,
9936,
253,
1332,
4081,
407,
253,
4477,
476,
320,
923,
347,
247,
26647,
281,
10341,
4872,
3237,
347,
4767,
407,
253,
4477,
285,
3103,
556,
3710,
22458,
600,
50275,
3113,
253,
19191,
22170,
1307,
285,
253,
11786,
1307,
4081,
407,
253,
4477,
352,
310,
417,
2590,
387,
512,
604,
359,
403,
10491,
432,
253,
987,
12637,
3268,
891,
2096,
326,
253,
1332,
11026,
1175,
5304,
1543,
533,
275,
4910,
751,
260,
5643,
11682,
253,
5304,
3290,
943,
417,
320,
253,
760,
17082,
347,
359,
671,
878,
247,
3210,
534,
2085,
1175,
11649,
21652,
824,
15711,
403,
5816,
1060,
347,
247,
806,
1953,
476,
368,
17710,
752,
310,
253,
13727,
2557,
273,
253,
4081,
12393,
1566,
50276,
18,
465,
1403,
274,
1162,
355,
1384,
1423,
50276,
3354,
80,
2182,
12393,
20384,
3210,
374,
8511,
1162,
355,
1384,
1423,
50276,
16455,
12393,
3210,
253,
7364,
403,
5469,
275,
253,
6452,
891,
1119,
436,
5955,
281,
320,
4569,
5474,
339,
431,
248,
2929,
5936,
281,
3157,
253,
3045,
247,
12393,
3169,
14433,
6974,
407,
6240,
281,
697,
11269,
253,
11786,
273,
247,
2412,
7513,
10202,
1307,
326,
4648,
253,
3048,
875,
1269,
74,
281,
1269,
17,
690,
15965,
16038,
310,
2530,
337,
253,
10199,
629,
310,
14999,
368,
943,
3748,
643,
3082,
326,
897,
253,
1072,
3215,
11273,
2235,
641,
323,
16161,
1142,
2761,
10341,
14433,
8892,
26332,
275,
271,
440,
35421,
8142,
323,
1650,
36827,
3169,
14433,
3082,
490,
285,
253,
6422,
10358,
395,
1993,
7266,
1850,
10225,
398,
2746,
260,
615,
253,
7266,
1850,
10225,
398,
2746,
260,
534,
310,
1199,
5662,
685,
12393,
3169,
14433,
3082,
7744,
4648,
2087,
1831,
3014,
260,
9866,
1850,
10225,
398,
281,
16209,
253,
2720,
1561,
34560,
15849,
277,
299,
50276,
601,
275,
958,
12393,
3169,
14433,
3082,
476,
320,
7192,
347,
7266,
12955,
342,
966,
6160,
1850,
10225,
398,
3413,
407,
253,
3733,
941,
326,
476,
6016,
9559,
6046,
2308,
534,
2789,
731,
1006,
800,
3210,
891,
1902,
841,
12989,
281,
320,
5469,
275,
253,
10199,
50276,
19,
253,
5415,
15895,
273,
253,
12393,
1232,
275,
253,
10199,
629,
3133,
28116,
50276,
20,
4993,
253,
14951,
275,
16186,
1458,
368,
1599,
340,
2581,
685,
340,
17,
987,
50276,
12563,
253,
4154,
273,
7856,
89,
17,
3830,
310,
1269,
74,
987,
436,
943,
320,
4518,
3542,
50276,
21,
253,
9759,
273,
253,
278,
29676,
24405,
3198,
281,
320,
5520,
9093,
253,
16751,
7658,
310,
253,
1491,
327,
253,
2720,
273,
253,
2625,
253,
9261,
432,
1269,
74,
281,
1269,
17,
24088,
783,
7658,
275,
16186,
2145,
310,
3907,
273,
340,
2568,
275,
253,
1307,
326,
310,
2879,
281,
16186,
1458,
2429,
281,
16186,
818,
368,
2168,
4684,
352,
342,
253,
2412,
7513,
10202,
1307,
941,
32422,
1307,
3021,
352,
943,
320,
4518,
4767,
4768,
253,
2929,
326,
436,
747,
1307,
24772,
941,
32422,
1307,
285,
253,
2720,
1491,
273,
253,
9261,
432,
1269,
74,
281,
1269,
17,
33810,
891,
5257,
281,
2868,
326,
627,
2226,
643,
2987,
326,
671,
897,
4931,
275,
247,
2074,
1039,
253,
2412,
7513,
10202,
1159,
275,
616,
34560,
12393,
14433,
15849,
4496,
1056,
2119,
326,
368,
3835,
253,
44528,
6239,
50275,
22,
3877,
326,
323,
9765,
17,
634,
3081,
1307,
310,
32093,
3021,
634,
4679,
943,
2486,
28913,
1263,
285,
671,
690,
5955,
285,
8368,
273,
253,
1055,
273,
1027,
2193,
273,
9765,
50276,
23,
4496,
12654,
326,
627,
310,
642,
29713,
875,
16186,
1458,
285,
253,
11333,
326,
368,
2686,
897,
275,
253,
4679,
323,
1650,
275,
5933,
337,
275,
30762,
260,
1386,
884,
310,
417,
15616,
342,
16186,
1458,
9765,
310,
31458,
342,
247,
4315,
25761,
10018,
326,
634,
278,
29676,
4445,
275,
1386,
884,
273,
5933,
337,
27309,
672,
368,
11897,
891,
377,
50276,
1678,
2981,
7239,
17,
3498,
17,
50276,
5265,
50276,
1678,
2981,
7239,
17,
3498,
17,
50276,
5265,
50276,
1678,
89,
17,
7239,
17,
3498,
17,
50276,
9665,
17,
69,
2981,
50276,
5265,
50276,
377,
3498,
17,
90,
17,
50276,
9665,
17,
69,
2981,
50276,
17,
50276,
12157,
891,
377,
50276,
377,
17,
1580,
323,
275,
31406,
1076,
268,
310,
278,
10175,
273,
253,
6489,
4315,
275,
594,
7266,
310,
247,
12378,
4315,
50276,
24,
275,
634,
5933,
512,
253,
10165,
273,
247,
285,
270,
323,
253,
1027,
8892,
1690,
45830,
2789,
16186,
1668,
285,
16186,
854,
28588,
342,
253,
896,
856,
5342,
3213,
275,
299,
253,
34560,
10393,
273,
824,
247,
941,
15274,
3213,
342,
3215,
11273,
1850,
10225,
398,
534,
347,
5393,
1840,
310,
1077,
2074,
281,
12393,
3169,
14433,
556,
644,
4081,
275,
299,
285,
28055,
5867,
275,
956,
484,
2987,
824,
1077,
2905,
2987,
943,
320,
5393,
50276,
25,
253,
21607,
275,
253,
4679,
1646,
8489,
5075,
323,
1650,
253,
7355,
326,
310,
3559,
281,
690,
273,
731,
2544,
1014,
253,
1929,
15115,
275,
275,
31406,
1076,
9295,
447,
326,
513,
417,
16584,
824,
1929,
1491,
403,
9090,
5075,
7613,
352,
651,
452,
644,
4722,
285,
27096,
281,
923,
5301,
342,
2266,
1327,
2072,
422,
36827,
3169,
14433,
3082,
824,
347,
270,
9090,
342,
305,
507,
326,
403,
10166,
327,
253,
1072,
3733,
941,
347,
253,
643,
3082,
24088,
407,
970,
1846,
15302,
285,
305,
507,
253,
1332,
275,
270,
310,
247,
2590,
1650,
273,
247,
1332,
326,
1057,
417,
2965,
715,
253,
4477,
3908,
326,
253,
3745,
281,
6558,
3962,
6814,
15274,
50276,
261,
2223,
417,
10048,
323,
440,
35421,
36827,
3169,
5482,
50276,
26,
275,
253,
4679,
2593,
368,
1364,
2486,
247,
5955,
327,
253,
1180,
273,
25142,
285,
20243,
273,
253,
1027,
3082,
326,
403,
6730,
50275,
66,
270,
6464,
247,
480,
267,
267,
247,
4376,
299,
285,
3317,
30441,
639,
4240,
480,
2988,
21012,
17950,
970,
1006,
800,
3210,
275,
5213,
8059,
327,
5145,
4715,
7266,
608,
17820,
2950,
268,
1686,
83,
50276,
67,
5424,
34103,
618,
20052,
83,
246,
285,
48496,
9820,
391,
9169,
1049,
21704,
2460,
26672,
422,
36827,
1754,
14433,
275,
10061,
273,
253,
39951,
2284,
8059,
327,
13345,
9260,
1936,
5910,
642,
16703,
7266,
30581,
1012,
13482,
50276,
68,
8097,
29846,
518,
41657,
11943,
18504,
29909,
1342,
7318,
285,
259,
37844,
4978,
270,
4072,
372,
4246,
10358,
395,
1993,
2235,
641,
323,
1566,
1754,
14433,
275,
4072,
26332,
1796,
4156,
8059,
327,
2625,
285,
1491,
5162,
7266,
898,
28333,
2385,
26332,
1796,
50276,
69,
1182,
12109,
465,
10736,
80,
259,
1149,
256,
285,
1182,
12109,
298,
4240,
4715,
3676,
260,
9866,
1850,
80,
9141,
2720,
323,
2460,
20384,
275,
10061,
273,
253,
26332,
1796,
8059,
327,
4382,
8113,
285,
3102,
8981,
7266,
6931,
1717,
1867,
1839,
50276,
70,
20052,
83,
246,
285,
48496,
9820,
391,
4765,
2460,
20384,
407,
34560,
1850,
80,
2182,
285,
19265,
20553,
26332,
1796,
13122,
327,
2460,
5162,
34360,
7266,
805,
1252,
20210,
50276,
2577,
3533,
285,
5701,
403,
7117,
1840,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
281,
19071,
16751,
10806,
275,
12393,
1566,
323,
13737,
3237,
253,
2087,
13969,
846,
30080,
22559,
310,
326,
436,
19529,
310,
18338,
273,
14924,
281,
5723,
2824,
4496,
19071,
253,
5780,
30628,
8680,
323,
253,
4049,
254,
609,
5102,
2715
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes an nn-driven model for designing novel traffic engineering techniques automatically.

this paper is missing details on neural architecture, configuration parameters and experiment details. i had major difficulty understanding the write-up due to these missing details. let me elaborate.

1. no details on the model: the paper does not provide the details of the nn. how many layers? what is the size of the input? how many neurons per layer? what is the loss function and how is it trained?

2. experiment setup: why are 10 simulations sufficient? is "random" uniform random? how many concurrent failures? for a large network, 10 simulations of failure events is not sufficient.

3. model abstraction and retraining of the network: does the proposed model have to be retrained for a different network size and topology? how long does it take to retrain, and what is the initial training time?

4. simulation: what traffic is being simulated? what are the datacenter applications that the paper is using?

overall, the paper is very light on details and is not fit for publication at this time.

other issues: 1. smartnics cannot behave like switches, so the nn inference they are planning to do has to be done on p4 switches only; the authors should elaborate on how they plan to deploy their model in practice. 2. no comparison with contra, which is state-of-the-art; they do not show how their te algorithm behaves against contra for the exact same performance goals.

this paper is missing details on neural architecture, configuration parameters and experiment details. i am not sure what the novelty of the paper is; it is most probably due to the fact that the paper significantly lacks in details.

docsep

the paper proposes an ml-based approach, mistill, to help automatically deploy a distributed protocol from a given te policy. the approach learns forwarding decisions together with intermediate results, such as exchange information and link-state encoding, from exemplary policies. the resulting network can later be deployed in switches to encode link-state and compute forwarding decisions. the technical contributions are the following: first, they leverage a reparameterization trick to handle categorical distributions when encoding link states into a binary vector; second, they use the scaled dot-product attention method to help switches learn to exploit the encodings mentioned above and make the forwarding decision. in the evaluation part, the paper compares with three custom baselines (i.e. lcp, minmax, wcmp) and exceeds all of them in terms of corresponding metrics.

strengths: 1. the paper gives some insights in incorporating machine learning techniques into the traffic engineering area, including how to encode the link-state using reparameterization and how to communicate using attention techniques. 2. the paper gives a general analysis of the pros and cons of ml approaches; it is informative when evaluating the feasibility of applying ml on real switches.

weaknesses: 1. the paper represents the challenges in a vague way. in section 1 (introduction), the paper illustrates the necessity of customizing traffic engineering schemes for various applications; however, it lacks concrete analysis of the challenges that lie in translating a te policy into a distributed protocol. only a general description in paragraph 2, such as "it requires the specification of exchange data, the processing of the data and the algorithm", is not solid and analytical enough to demonstrate how these challenge the design of a better te
scheme. the paper should show how difficult it is to process per-application-level data and to design an algorithm; furthermore, they do not explain how incorporating machine learning techniques mitigates those challenges.

2. the paper presents unclear descriptions of technical details. (1) in the second paragraph of section 2 they claim that the proposed approach will learn two aspects, i.e. process and exchange the local state of switches, and map the exchanged state into forwarding decisions. however, later in the fourth paragraph they mention that the model should learn more than these two aspects: the model should also learn how to react to changes in the network, such as node and link failures and changes in monitored measures. they do not explain how the latter aspect is related to the former two aspects. (2) in section 3, paragraph 5, "does to select the hnsas necessary for this" indicates that there might be more hnsas; also, it is unclear what "this" refers to.

3. the evaluation part is not complete enough. first, the paper does not show the comparison of computing overhead compared to baselines; as a result, we do not know if the proposed one can be deployed in an actual data center deployment. how long will it take for training? what's the computing overhead for inference? second, they do not use traffic features from various applications, and thus we do not know if the proposed approach can adapt to various traffic features; this is crucial in traffic engineering, as stated in section 1 (introduction).

4. the paper does not discuss the possible technical defects of their approach. for example, with wrong forwarding decisions accumulating, network congestion may happen. how does the approach handle problems that arise from intrinsic errors of the machine learning algorithm?

the paper gives good insights into combining machine learning techniques and traffic engineering. however, it lacks solid evaluation to demonstrate the overhead of this approach and lacks analysis of problem handling. moreover, the representation of the background and some of the technical details does not help readers to understand. therefore the paper should 1) add evaluations demonstrating the overhead of ml techniques and analyze the possible problems that may arise from prediction errors, 2) rewrite ambiguous sentences to provide more clear technical details, 3) show more concrete difficulties in translating traffic engineering policy into protocols.

docsep

this work is targeting the problem of traffic engineering in computer networks. it proposes a learning method called mistill for making packet forwarding decisions in distributed network switches, using data generated by a centralized policy. the paper assumes that a forwarding policy can be obtained in a centralized fashion with global information, and the goal is to train a neural network model that can make the correct forwarding decision in each switch without global states. the network switch can communicate its local state to other switches to improve the overall performance.

this work tries to apply existing machine learning techniques to solve an important computer network problem. however, i have a number of concerns with this work.

1. the first concern is related to the contribution of the paper. this paper is mainly applying existing neural network techniques to the traffic engineering problem; there is a very limited contribution towards the machine learning techniques. on the other hand, if we focus on the application side, it seems the computer network evaluation environment is not realistic enough, since only simulation
results are provided (more on this point later).

2. the authors have emphasized in the paper that the network switches have limited computing power and the forwarding model needs to run in the data plane. however, there is no evaluation to show that the designed model would be able to run sufficiently fast in such an environment. in fact, chen et al. (2018) have shown that it is challenging to directly use neural networks in the data plane to make forwarding decisions due to latency constraints: neural network latencies are at a millisecond level, but in today's data center, which is the main concern for this paper, switches need to make switching decisions at a microsecond level. it is important to evaluate this in a real environment to persuade the reader that this method is actually feasible.

3. one of the challenges when designing te techniques is the low-latency monitoring of the network state, such as in alizadeh et al. (2018). in this work the authors propose sharing each switch's local state in a broadcast manner (selective nodes broadcast). is this an efficient communication pattern? would it cause high latency and overhead?

4. one of the benefits of decentralized control is the scalability of the system. however, it seems the current design requires retraining the model when adding more switches to the network.

5. what is the traffic load used in the evaluation? would the algorithms still work when a different traffic load is used to evaluate the system? the policies used in the evaluation are relatively simple; more realistic workloads would be more desirable.

6. presentation issue: some of the images' legends are overlapping with the curves.

this paper applies existing machine learning techniques to solve the traffic engineering problem in computer networks. the contribution toward the machine learning methodologies seems to be limited in this work, and the evaluation needs to be conducted in a more realistic environment to demonstrate the feasibility of this proposed method for the targeted application.

docsep

the paper studies the use of ml to design protocols for a datacenter application. in particular, a single-server data center, a fat-tree topology and a single policy are assumed. a neural network is trained to learn forwarding policies from examples. a modular structure is proposed to minimize complexity. the nn is trained to be adaptive, but the model itself is fixed at the end of training.

the paper is a welcome addition to the sparse set of papers on applying ml to problems in communication networks; compared with image classification, for example, a challenge of course is the availability of data. overall the paper reads well and it contains detailed information on the architecture and training (perhaps a bit too much for the iclr audience). there are several issues that need clarification.

1. do all switches need to have mistill installed? can this work in a hybrid architecture where some switches do not have mistill?

2. the work is focused on the datacenter application. the assumption of clos topologies makes it inapplicable to packet-switched networks, which are the norm today.

3. is the topology assumed known? does every node know its parent and children nodes?

4. the
specific policy learned here is that of dropping packets or flows if a node is unreachable due to link or node failures. this seems to be a simple problem given the fat-tree structure. does the policy also switch to a new route if the current route is unavailable?

5. typically, protocols must instantiate several policies, not just a single policy. can the framework be extended to do that?

6. given that the topology is assumed to be a tree, it is not clear that the approach can be extended to making use of new links, as would be the case in an ad hoc mesh network: not just links (and hence paths) disappearing in a tree, but also new links forming or potentially forming.

7. a beta term is introduced in the modified loss function in (6) to ensure that embeddings corresponding to different nodes do not end up close to each other, but this is precisely what contrastive learning is supposed to achieve; it is not clear why the authors were not able to get nce to work.

8. associated with nodes and links is context information, which can also be used to differentiate between embeddings; this has been widely used in the social networks literature. this paper assumes that nodes are identical and have no context information.

9. the paper makes a criticism of contra, that the latter relies on the use of a high-level policy language. what is wrong with that approach? if an ml model is to learn a policy, it should be able to recognize and resolve conflicts in the examples provided; the conflict may be due to a lack of full context being provided or used by the learner. a purely data-driven approach may not be able to offer insights on what has been learned, vs. a neuro-symbolic approach where rules might be learned and possibly interpretable by a human.

10. figures 5 and 6 provide some hints, but it is not clear what the model has learned.

11. there is some discussion of overhead (top of page 5), but this is insufficient; i would like to get some more details.

12. shortest-path routes are mentioned several times, but there is only one cycle-free path on a tree.

13. the 2009 al-fares paper indicates that a k-ary fat tree has (k/2)^2 core switches and can support k^3/4 hosts; for k = 8 this leads to 16 core switches and 128 hosts, but the paper says 80 switches and 128 hosts. please clarify. also, are there k = 8 servers at the data center?

typos and such: please define acronyms before, not after, first use, e.g. hnsa in sec. 3.

the paper studies the use of ml to design protocols for a datacenter application. it adds to the sparse set of papers on the application of ml to problems in communication networks and is a welcome change from papers dealing yet again with mnist. my main concern relates to generalizability: can the framework be extended to non-tree topologies (in particular where links may appear as well as disappear, and where multiple routes exist), to protocols that must support multiple policies, and to the incorporation of contextual information associated with nodes (perhaps also links)? it is not clear what the model has learned, and some additional details about the test setup are needed.
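as a quick sanity check on the arithmetic in point 13, the standard k-ary fat-tree counts can be computed directly. the snippet below is illustrative only; it follows the usual al-fares-style definition and is not code from the paper under review.

```python
def fat_tree_counts(k: int) -> dict:
    """Switch and host counts for a standard k-ary fat tree (k even)."""
    core = (k // 2) ** 2          # core switches: (k/2)^2
    aggregation = k * (k // 2)    # k pods, k/2 aggregation switches each
    edge = k * (k // 2)           # k pods, k/2 edge switches each
    hosts = k ** 3 // 4           # k/2 hosts per edge switch
    return {
        "core": core,
        "aggregation": aggregation,
        "edge": edge,
        "total_switches": core + aggregation + edge,
        "hosts": hosts,
    }

print(fat_tree_counts(8))
# -> {'core': 16, 'aggregation': 32, 'edge': 32, 'total_switches': 80, 'hosts': 128}
```

so 16 is the core-switch count only, while the total switch count for k = 8 is 5k^2/4 = 80; that may be what the paper is reporting, and this reconciliation is worth confirming with the authors.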
### Summary: | most of the reviewers thought this paper has issues where it could be improved there was a range of concerns most importantly several reviewers felt the novelty in the paper was unclear as well as the requirement for more details in the experimental evaluations | [
7693,
1054,
4090,
259,
21630,
285,
23141,
512,
273,
731,
275,
2426,
273,
3969,
17082,
20544,
337,
253,
2929,
4245,
690,
16039,
275,
24049,
5145,
4715,
5609,
715,
253,
7137,
11369,
2170,
1690,
849,
281,
22573,
253,
3048,
3409,
970,
294,
19484,
1320,
285,
849,
281,
13791,
970,
4116,
5609,
50276,
19,
253,
2929,
4245,
247,
2087,
1783,
273,
253,
5847,
285,
772,
273,
13361,
7274,
352,
310,
27096,
672,
16344,
253,
25720,
273,
9433,
13361,
327,
1524,
20994,
50276,
20881,
1255,
265,
337,
253,
2929,
6125,
253,
7881,
275,
247,
21248,
1039,
275,
2593,
337,
10199,
253,
2929,
18303,
253,
15504,
273,
2840,
3006,
7137,
11369,
15849,
323,
2710,
4893,
2299,
352,
19756,
11859,
1783,
273,
7881,
326,
7027,
275,
42477,
247,
716,
3646,
715,
247,
5939,
7241,
760,
247,
2087,
5740,
275,
12494,
374,
824,
347,
352,
4419,
253,
17776,
273,
6431,
941,
253,
5162,
273,
253,
941,
285,
253,
5933,
50276,
261,
417,
4891,
285,
16101,
2217,
281,
7568,
849,
841,
5691,
253,
2216,
273,
247,
1805,
716,
6974,
253,
2929,
943,
921,
849,
2834,
352,
310,
326,
281,
1232,
591,
13259,
5251,
941,
285,
281,
2216,
271,
5933,
33810,
597,
513,
417,
5513,
849,
24049,
5145,
4715,
5609,
29966,
1110,
7881,
50276,
19,
253,
2929,
10262,
12744,
20121,
273,
7681,
4278,
337,
275,
253,
1273,
12494,
273,
2593,
374,
597,
1750,
326,
253,
4081,
2746,
588,
3037,
767,
7794,
26332,
1232,
285,
6431,
253,
1980,
1375,
273,
20994,
285,
3711,
253,
25920,
1375,
715,
45279,
7089,
2299,
1996,
275,
253,
7002,
12494,
597,
3748,
326,
253,
1566,
943,
3037,
625,
685,
841,
767,
7794,
253,
1566,
943,
671,
3037,
849,
281,
8071,
281,
2544,
275,
253,
2990,
824,
347,
4666,
285,
3048,
20101,
285,
2544,
275,
16387,
5593,
597,
513,
417,
5513,
849,
253,
6158,
4809,
310,
2905,
281,
253,
3438,
767,
7794,
374,
275,
2593,
495,
12494,
608,
1057,
281,
3609,
253,
288,
2224,
284,
3309,
323,
436,
6492,
326,
627,
1537,
320,
625,
288,
2224,
284,
671,
352,
310,
12744,
752,
436,
10770,
281,
50276,
20,
253,
7103,
629,
310,
417,
3426,
2217,
806,
253,
2929,
1057,
417,
921,
253,
5301,
273,
12672,
18332,
2429,
281,
1666,
25379,
347,
247,
906,
359,
513,
417,
871,
604,
253,
4081,
581,
476,
320,
18329,
275,
4588,
941,
4055,
19007,
849,
1048,
588,
352,
1379,
323,
3733,
47515,
253,
12672,
18332,
323,
17032,
1273,
597,
513,
417,
897,
7137,
3386,
432,
2710,
4893,
285,
3021,
359,
513,
417,
871,
604,
253,
4081,
2746,
476,
5223,
281,
2710,
7137,
3386,
352,
310,
9560,
275,
7137,
11369,
347,
4767,
275,
2593,
337,
10199,
50276,
21,
253,
2929,
1057,
417,
2319,
253,
1896,
7681,
12834,
273,
616,
2746,
323,
1650,
342,
3430,
45279,
7089,
47125,
2990,
35367,
778,
5108,
849,
1057,
253,
2746,
6016,
3237,
326,
12893,
432,
15276,
6332,
273,
253,
5145,
4715,
5933,
253,
2929,
4245,
1175,
16039,
715,
16248,
5145,
4715,
5609,
285,
7137,
11369,
2299,
352,
19756,
4891,
7103,
281,
7568,
253,
18332,
273,
436,
2746,
285,
19756,
1783,
273,
1895,
10885,
25761,
253,
6779,
273,
4114,
285,
690,
273,
253,
7681,
4278,
1057,
417,
1361,
10668,
281,
2096,
3103,
253,
2929,
943,
337,
823,
27163,
17227,
253,
18332,
273,
13361,
5609,
12106,
253,
1896,
3237,
326,
778,
12893,
432,
10554,
6332,
374,
24813,
23851,
14683,
281,
2085,
625,
2590,
7681,
4278,
495,
921,
625,
11859,
12748,
275,
42477,
7137,
11369,
3646,
715,
14238,
5474,
33032,
2520,
789,
310,
12262,
253,
1895,
273,
7137,
11369,
275,
4382,
6928,
352,
29328,
247,
4715,
1332,
1925,
5384,
408,
323,
2403,
13138,
45279,
7089,
275,
5939,
2990,
20994,
970,
941,
4561,
407,
247,
36409,
3646,
253,
2929,
19584,
326,
247,
45279,
3646,
476,
320,
2797,
275,
247,
36409,
8142,
342,
4156,
1491,
285,
253,
4736,
310,
281,
6194,
247,
11454,
2990,
1566,
326,
476,
1056,
253,
3451,
45279,
3061,
275,
1016,
5234,
1293,
4156,
3054,
253,
2990,
5234,
476,
13791,
697,
1980,
1375,
281,
643,
20994,
281,
3157,
253,
4583,
3045,
50276,
2520,
789,
14177,
281,
4647,
5368,
5145,
4715,
5609,
281,
8415,
271,
1774,
4382,
2990,
1895,
2299,
891,
452,
247,
1180,
273,
7350,
342,
436,
789,
50276,
18,
253,
806,
4468,
310,
2905,
281,
253,
7680,
273,
253,
2929,
436,
2929,
310,
7194,
9433,
5368,
11454,
2990,
5609,
323,
253,
7137,
11369,
1895,
627,
310,
247,
1077,
3710,
7680,
4404,
253,
5145,
4715,
5609,
327,
253,
643,
1133,
604,
359,
2770,
327,
253,
2898,
1930,
352,
3133,
253,
11897,
2990,
7103,
3126,
310,
417,
15958,
2217,
1580,
760,
9864,
1543,
403,
2530,
625,
327,
436,
1127,
1996,
50276,
19,
253,
4477,
452,
21947,
275,
253,
2929,
326,
253,
2990,
20994,
452,
3710,
12672,
1612,
285,
253,
45279,
1566,
3198,
281,
1408,
275,
253,
941,
6415,
2299,
627,
310,
642,
7103,
281,
921,
326,
253,
4158,
1566,
651,
320,
2104,
281,
1408,
10481,
3809,
275,
824,
271,
3126,
275,
958,
260,
864,
1162,
355,
4765,
452,
2011,
326,
352,
310,
11132,
281,
3587,
897,
11454,
6928,
275,
253,
941,
6415,
281,
1056,
45279,
7089,
1955,
281,
22667,
10806,
11454,
2990,
4329,
4601,
403,
387,
247,
5499,
26054,
1268,
533,
275,
281,
11015,
941,
4055,
534,
310,
253,
2022,
4468,
323,
436,
2929,
5234,
878,
281,
1056,
12797,
7089,
387,
247,
2494,
9815,
1268,
352,
310,
1774,
281,
7472,
436,
275,
247,
1524,
3126,
281,
29720,
253,
9414,
326,
436,
1332,
310,
2686,
17887,
50275,
20,
581,
273,
253,
7881,
672,
20462,
716,
5609,
310,
253,
1698,
22667,
8667,
273,
253,
2990,
1375,
824,
347,
275,
355,
478,
796,
73,
1162,
355,
4765,
275,
436,
789,
253,
4477,
12661,
9628,
1016,
5234,
84,
1980,
1375,
275,
247,
10675,
5133,
13687,
7632,
10675,
310,
436,
271,
5919,
5511,
3102,
651,
352,
2847,
1029,
22667,
285,
18332,
50276,
21,
581,
273,
253,
5373,
273,
40880,
1453,
310,
253,
9171,
1430,
273,
253,
985,
2299,
352,
3133,
253,
1655,
2216,
4419,
851,
26208,
253,
1566,
672,
6240,
625,
20994,
281,
253,
2990,
50276,
22,
752,
310,
253,
7137,
3301,
908,
275,
253,
7103,
651,
253,
11333,
1335,
789,
672,
1027,
7137,
3301,
310,
908,
281,
7472,
253,
985,
253,
7823,
908,
275,
253,
7103,
403,
4942,
2969,
625,
15958,
32140,
84,
651,
320,
625,
11408,
50276,
23,
9759,
2523,
690,
273,
253,
3888,
38209,
403,
21481,
342,
253,
6970,
50276,
2520,
2929,
10384,
5368,
5145,
4715,
5609,
281,
8415,
253,
7137,
11369,
1895,
275,
4382,
6928,
253,
7680,
2584,
253,
5145,
4715,
39396,
3133,
281,
320,
3710,
275,
436,
789,
285,
253,
7103,
3198,
281,
320,
5196,
275,
247,
625,
15958,
3126,
281,
7568,
253,
25720,
273,
436,
4081,
1332,
323,
253,
10522,
2898,
5474,
339,
431,
248,
2929,
2175,
253,
897,
273,
13361,
281,
2216,
14238,
323,
247,
2856,
7837,
254,
2898,
275,
1798,
247,
21864,
17188,
941,
4055,
247,
269,
1595,
658,
18080,
285,
247,
2014,
3646,
403,
8025,
247,
11454,
2990,
310,
3733,
281,
3037,
45279,
7823,
432,
6667,
50276,
66,
23178,
2605,
310,
4081,
281,
15338,
10454,
253,
48257,
310,
10166,
281,
320,
17825,
533,
253,
1566,
3139,
310,
4229,
387,
253,
990,
273,
3733,
253,
2929,
2175,
253,
897,
273,
13361,
281,
2216,
14238,
323,
247,
2856,
7837,
254,
2898,
275,
1798,
247,
21864,
17188,
941,
4055,
247,
269,
1595,
658,
18080,
285,
247,
2014,
3646,
403,
8025,
247,
11454,
2990,
310,
3733,
281,
3037,
45279,
7823,
432,
6667,
50276,
66,
23178,
2605,
310,
4081,
281,
15338,
10454,
253,
48257,
310,
10166,
281,
320,
17825,
533,
253,
1566,
3139,
310,
4229,
387,
253,
990,
273,
3733,
50276,
783,
2929,
310,
247,
10112,
1635,
281,
253,
23507,
873,
273,
9380,
327,
9433,
13361,
281,
3237,
275,
5511,
6928,
2429,
342,
2460,
9162,
323,
1650,
247,
5691,
273,
2282,
310,
11659,
273,
941,
50275,
1189,
455,
253,
2929,
9563,
973,
285,
352,
4428,
7000,
1491,
327,
253,
10336,
285,
3733,
4931,
247,
2372,
1512,
1199,
323,
253,
17857,
32888,
8446,
50276,
9088,
403,
2067,
3374,
326,
878,
37699,
50276,
18,
513,
512,
20994,
878,
281,
452,
5384,
408,
8038,
476,
436,
789,
275,
247,
9769,
10336,
835,
690,
20994,
513,
417,
452,
5384,
408,
50276,
19,
253,
789,
310,
7106,
327,
2856,
7837,
254,
2898,
253,
9376,
273,
502,
493,
412,
5970,
2789,
352,
275,
40812,
281,
20342,
88,
34005,
6928,
534,
403,
253,
5222,
3063,
50276,
20,
310,
253,
18080,
8025,
1929,
1057,
1046,
4666,
871,
697,
4651,
285,
2151,
7632,
50276,
21,
253,
2173,
3646,
6311,
1060,
310,
326,
273,
18752,
20342,
390,
14221,
604,
247,
4666,
310,
440,
21943,
494,
1955,
281,
3048,
390,
4666,
20101,
436,
3133,
281,
320,
247,
2969,
1895,
1677,
253,
4688,
5202,
2605,
1057,
253,
3646,
671,
5234,
281,
247,
747,
7622,
604,
253,
1655,
7622,
310,
29356,
50275,
22,
5431,
14238,
1364,
8164,
4513,
2067,
7823,
417,
816,
247,
2014,
3646,
476,
253,
7792,
320,
6508,
281,
513,
326,
50276,
23,
1677,
326,
253,
18080,
310,
8025,
281,
320,
247,
5202,
352,
310,
417,
2590,
326,
253,
2746,
476,
320,
6508,
281,
2403,
897,
273,
747,
4859,
347,
651,
320,
253,
1083,
275,
271,
519,
26901,
17489,
2990,
417,
816,
4859,
285,
7613,
11865,
42689,
275,
247,
5202,
533,
671,
747,
4859,
9046,
390,
7826,
9046,
50276,
24,
247,
9840,
1307,
310,
5611,
275,
253,
7321,
2957,
1159,
275,
721,
281,
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work introduces a new irl framework soloirl that learns a reward function using only expert trajectories this has the benefit of being trained in an offline manner which speeds up the training process soloirl builds on top of the work of uchibe 2018 and exploits the fact that a discriminator can replace the binary classification used in logregirl they improve on logregirl by overcoming the difficulty of selecting an appropriate baseline trajectories by proposing to use adversarial oneclass classification sabokrou et al 2018 the authors empirically demonstrate their results on the cartpole and bipedalwalker tasks and show superior performance over logregirl strengths this work demonstrates an interesting and insightful extension to irl frameworks the authors created an efficient irl method that can be trained in a completely offline manner by replacing the binary classification with adversarial oneclass classifiction from logregirl they remove the difficulty of picking a good baseline trajectory that logregirl requires furthermore this idea of adversarially learned oneclass classifier draws on the idea of anomaly detection problem to learn a good discriminator this can potentially have benefits for a stronger baseline trajectory during training indeed this was demonstrated in the results in the bipedalwalker task where the random policy could not provide an appropriate baseline trajectory for the expert trajectory to progress to the goal weaknesses the main weakness of this work is the sparsity of the experimental section a baseline that should be compared to is behavioral cloning bc the setting of this work is under an offline setting using only expert trajectories which is the same for bc thus i am curious if bc performs onpar with soloirl the authors do not explicitly mention an advantage of their method over simple bc although i would imagine the reward function may be an advantage and thus i would like to see its performance compared to bc the authors demonstrate that the reward function is better shaped than logregirl but they do not showcase how it can be used it would be interesting to see if the learned reward can be used to transfer to like tasks one additional concern is the sparsity in tasks and baselines in the experimental section it would be interesting to see more than 1 baseline and tasks other than bipedalwalkercartpole since cartpole is quite simple the authors also mentioned that adjusting the noise in soloirl is difficult it would be nice to see an ablations section to understand how sensitivehard to tune that is as compared to selecting baseline trajectories miscellaneous a more extensive background of the irl field would be nice a few citations to include maxentirl ziebart et al 2008 guided cost learning finn et al 2016 generative adversarial imitation learning ho et al 2016 behavioral cloning from observation torabi et al 2018 the authors introduce an interesting new framework for irl that can learn a reward function in an offline manner furthermore they draw insights from anomaly detection problem and improve upon the logregirl they demonstrate empirically that their method can learn a better reward function and achieve higher performance on two openai gym tasks however the experimental section for this work is very sparse if the authors can address the concerns listed above in particular adding bc as a baseline i would be willing to increase my rating docsepthe paper proposes an adversarial inverse reinforcement learning algorithm that learns purely from 
expert demonstrations and does not require any online interaction with the environment or a dataset of unlabeled interactions with the environment the key idea is to synthesize negative examples ie examples of nonexpert behavior using a denoising autoencoder trained on the positive examples ie expert demonstrations experiments on the bipedalwalker simulated locomotion task show that the proposed method learns a reward function such that an rl agent trained to maximize the learned rewards achieves higher true rewards than a prior method overall this is an interesting paper but im concerned that the experiments dont compare to prior methods that also learn a reward function purely from expert demonstrations without any unlabeled data random expert distillationhttpsarxivorgabs190506750 disagreementregularized imitation learninghttpsopenreviewnetforumidrkgbyyhtwb onailhttpsarxivorgabs200803525 im also concerned that the experiments only show that the proposed method outperforms prior work on one task bipedalwalker i would consider raising my score if the paper included experiments with different tasks and compared to at least one of the prior methods listed above update thank you to the authors for adding the experimental comparisons to bc and red on the hopper and walker2d simulated locomotion tasks unfortunately it seems that the bc and red baselines substantially outperform the proposed method on those two tasks so i will keep my original score lacking comparisons to relevant prior methods and evaluations on diverse tasks docsepthe paper proposes a stateonly offline ilr algorithm learning reward function soloirl by reducing irl to adversarial oneclass classification compared to most existing ilr algorithms the proposed algorithm is more efficient and requires fewer assumptions it does not require solving rl problems in the inner loop and does not require ranked expert trajectories or assumptions on trajectories generated by uncontrolled transitions probabilities the authors show the algorithm learns reasonable reward on two simulated control tasks and significantly outperforms the logregirl algorithm that it extends strengths the problem that the paper tends to address is challenging and is very relevant to the iclr community the algorithm proposed seems simple and can be implemented using the offtheself adversarial oneclass algorithms although tuning noise level may be required as the authors noted the paper is overall clearly written and easy to follow concerns the approach that the authors propose is very interesting however i am not sure whether the analysis of logregirl in section 23 carries over to the proposed method the analysis of logregirl seems to only apply when learning classifiers that distinguish between trajectories generated from uncontrolled and expertcontrolled transition probabilities its not shown in the paper what reward and value functions are learned by learning classifiers that distinguish between expertcontrolled and noisecorruptedexportcontrolled data therefore theoretical analysis of the proposed method is lacking do the reward function and transition probability in the experiments satisfy the requirements of lmdp the experiments need more work some ablation analysis would help to understand what change contributes to the improvement on the bipedalwalker environment among the addition of generator weight decay and changed crossentropy logregirl is affected by baseline trajectories in theory it would be helpful to compare to related works such as drex brown et al 
2019 an extension of trex that proposes a similar idea of injecting noise to demonstrated trajectories due to the randomness in network initialization and trajectory generation experiments with more seeds are helpful more tasks would also better demonstrate the performance of the algorithm i appreciated that the authors are honest about the difficulty in turning the noise levels could you give some intuition about this moreover what are the state dimensions in table 5 in page 13 its exciting to see the visualization of the rewards that soloirl learned it will be informative to plot the true reward function too other comments questions eq 4 rewriting eq 3 using the definition of kl divergence does not seem correct and eq 5 does not align with the equation in the logregirl paper is assuming the states are finite necessary in eg eq 4 or its just for convenience postrebuttal comments thank the authors very much for responding to my questions and adding more experiments in a short amount of time however the results do not seem very promising and i am still concerned with the theoretical grounding of the proposed algorithm therefore i would like to keep my score because of the concerns discussed in the main review tab i am leaning towards rejecting the paper for now
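The reviews above describe the core mechanism only in prose, so a minimal sketch may help: treat state-only reward learning as one-class discrimination, where corrupted copies of expert states stand in for non-expert data and the discriminator's logit is read off as a reward. Everything below is an illustrative assumption (state dimension, noise scale, network sizes, synthetic stand-in data), and plain Gaussian corruption replaces the learned generator / denoising autoencoder that the discussed methods actually use, so this is a sketch of the general idea rather than any paper's implementation.

```python
# Sketch: state-only reward learning via one-class discrimination (illustrative only).
import torch
import torch.nn as nn

torch.manual_seed(0)
STATE_DIM = 8        # assumed observation size
NOISE_STD = 0.5      # assumed corruption level; the reviews note this is hard to tune

# Stand-in for expert demonstration states; a real run would load recorded trajectories.
expert_states = torch.randn(1024, STATE_DIM)

reward_net = nn.Sequential(
    nn.Linear(STATE_DIM, 64), nn.ReLU(),
    nn.Linear(64, 1),  # logit: high for expert-like states, low for corrupted ones
)
opt = torch.optim.Adam(reward_net.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(200):
    idx = torch.randint(0, expert_states.shape[0], (128,))
    pos = expert_states[idx]                        # expert states  -> label 1
    neg = pos + NOISE_STD * torch.randn_like(pos)   # corrupted copies -> label 0
    logits = reward_net(torch.cat([pos, neg])).squeeze(-1)
    labels = torch.cat([torch.ones(128), torch.zeros(128)])
    loss = bce(logits, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The trained logit can then serve as a learned reward for a downstream RL agent.
print(reward_net(expert_states[:5]).squeeze(-1))
```

A downstream agent would simply query reward_net on visited states during policy optimization; nothing in this sketch addresses the baseline-trajectory or noise-tuning concerns raised above.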
### Summary: | this paper studies the problem of inverse reinforcement learning by relying on only demonstrations and no interaction like imitation learning the reviewers liked the premise but had major concerns with evaluation and baselines the paper initially received reviews tending to reject one of the questions was about missing behavior cloning baseline which the authors added in rebuttal but the bc baseline seems to be really competitive in fact better in 3 out of 4 envs as compared to the proposed approach in conclusion all reviewers still believed that their concerns regarding insufficient evidence for justifying approach and missing comparisons to other prior work still stand ac agrees with the reviewers consensus that the paper is not yet ready for acceptance | [
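Since the decision above hinges on the behavioral cloning comparison, a hedged sketch of what such a baseline typically amounts to may be useful: plain supervised regression from states to expert actions, with no reward learning at all. Sizes and data below are placeholders, and the sketch assumes the demonstrations include actions; a state-only variant (behavioral cloning from observation) would first fit an inverse dynamics model.

```python
# Sketch: a generic behavioral-cloning baseline of the kind referenced above (illustrative).
import torch
import torch.nn as nn

torch.manual_seed(0)
STATE_DIM, ACTION_DIM = 8, 2   # assumed sizes; real values depend on the environment

# Stand-ins for (state, action) pairs taken from expert demonstrations.
states = torch.randn(2048, STATE_DIM)
actions = torch.tanh(states @ torch.randn(STATE_DIM, ACTION_DIM))

policy = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(), nn.Linear(64, ACTION_DIM))
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

for step in range(300):
    idx = torch.randint(0, states.shape[0], (256,))
    loss = nn.functional.mse_loss(policy(states[idx]), actions[idx])
    opt.zero_grad()
    loss.backward()
    opt.step()

print(float(loss))  # final training MSE; deployment just calls policy(observation)
```

Unlike the reward-learning route, this yields no transferable reward function, which is the advantage the reviewers suggest the proposed method may still hold over the stronger-performing baseline.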
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary the authors use domain invariant categorical semantics to improve unsupervised domain translation udt they learn these semantics in an unsupervised manner they show how this can improve results on semantic preserving unsupervised domain translation and style heterogeneous domain translation by doing experiments on mnistsvhn features traditionally learned are very different but digit identity could be the same and sketchesreals distinct styles respectively strengths the visual results on both tasksdatasets are striking the paper is simple to read and the idea is intuitive experiments are extensive including an ablation on the losses and comparison against baselines weaknesses more examples of sketchesreal could have been shown when does this method fail some failure cases would be good docsepthe paper addresses the domain translation problem and proposes a novel approach to translate images between domains in an unsupervised manner by integrating unsupervised learning of domaininvariant semantic features between the two domains the paper is wellwritten with a clear standing point and motivation along with welldescribed contributions the paper has an inclusive and sound theoretical comparison to related work evaluation is welldesigned and includes previous work in the same context in overall it is a good paper with an original and potentially inspiring idea and a convincing application of conditional gans would be an interesting read for many in the conference typos page 1 showing that such a translation two situations where unsupervised domain translation page 7 the representations of the real images comments 1 unsupervised domain translation methods work due to an inductive bias toward identity in sections 1 and 32 need citation to support this statement 2 considering clustering is one of the important parts of the contributions it may need more emphasis in the paper it would be nice to restructure this part extend the discussion of alternatives and an analysis to compare different approaches also by moving some part of the discussion to the paper from the appendicesdocsepsummary this paper proposes to learn the categorical semantic features in an unsupervised manner to enhance the ability of image translation at preserving highlevel features and obtains some good results on semanticpreserving unsupervised domain translation and styleheterogeneous domain translation problems major issues the proposed method seems to be a combination of current works the main contribution of this work may be leveraging the unsupervised representation learning for semantic features extraction the quality of generated images is still not satisfactory with such rapid development of gans the experiment and qualitative evaluation are too limited only two image translation tasks are conducted for comparison and little visual results are given it will be preferred if some common i2i tasks results are given only fid is used adding other metrics such as lpips ndb and jsd will be more convincing minor issues what are the essential differences between spudt and shdt problem how does the model solve the two problems according to their differences docsepthis paper presents unsupervised domain translation udt considering two scenarios semantic preserving unsupervised domain translation spudt and is styleheterogeneous domain translation shdt this study uses mnist and svhn datasets for demonstrating spudt and sketches and reals samples from the domainnet dataset for demonstrating shdt although the method uses 
different components depending on the scenario or dataset the presented architecture is essentially the same the method consists of the framework for learning invariant categorical semantics across domains section 31 and semantic style modulation to make shdt generations consistent section 32 section 31 consists of unsupervised representation learning clustering and unsupervised domain adaptation however this section did not describe the first two components in detail probably because the method uses different components depending on the datasets described in section 4 unsupervised domain adaptation is realized by the minimization problem described in p4 section 32 describes content encoders semantics encoders obtained in section 31 mapping networks style encoders and generator the section presents the loss functions adversarial loss cycleconsistency loss styleconsistency loss style diversity loss and semantic loss section 4 compares the proposed method with udt baselines under spudt and shdt scenarios the experiments on mnist and svhn show that the proposed method achieved high accuracy in domain translation and high quality in the generation measured by fid the experimental results on sketches and reals show that the proposed method yields highquality generations overall the experiments demonstrate the importance of incorporating the semantics category in udt pros it was interesting to see that the performance of udt was drastically improved by incorporating semantic categories induced by the clustering algorithms the presented method was reasonable and well designed cons the main part of this paper is dense i had to go back and forth between the main body and the appendix a lot this paper does not seem to emphasize the technical novelty of this work although section 31 was designed to present a novel framework it does not explain the detail of the components methods for unsupervised representation learning and clustering are explained in section 4 the component of unsupervised domain adaptation may be a new proposal in this work however this paper did not explain this component in the main body i had to read appendix b5 to understand the formula presented in p4 the model architecture described in section 32 is mostly based on previous work eg adversarial loss cycleconsistency loss etc the semantic loss presented in this section may be a new addition to udt but again this is not emphasized in section 32 questions p4 we describe those regularizers in more detail in appendix b5 if this part is a proposal in this paper it should be explained as the main content of this paper if i do not read appendix b5 i have no idea how domain 2 is involved in the minimization problem minor comments p1 that are mappings biased toward the identity that those it may be convenient to have equation numbers so that reviewersreaders can locate a formula directly
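The clustering step these reviews keep returning to is easiest to see in code, so here is a hedged sketch: assign each image a categorical pseudo-label by k-means over encoder features, and let that label be the semantics a translator must preserve. The features below are synthetic stand-ins, and the reviewed papers' actual choices of representation learner, clustering algorithm, and domain-adaptation objective differ, so treat this only as an illustration of the general recipe.

```python
# Sketch: domain-invariant categorical pseudo-labels via clustering (illustrative only).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
FEAT_DIM, N_CLASSES = 32, 10                 # assumed feature size and category count

features_src = rng.normal(size=(500, FEAT_DIM))   # e.g. encoder outputs for mnist
features_tgt = rng.normal(size=(500, FEAT_DIM))   # e.g. encoder outputs for svhn

# Fit clusters on the source domain and reuse the centroids on the target domain;
# this only makes sense after the two feature distributions have been aligned by an
# unsupervised domain-adaptation step, which this sketch does not implement.
kmeans = KMeans(n_clusters=N_CLASSES, n_init=10, random_state=0).fit(features_src)
pseudo_src = kmeans.predict(features_src)
pseudo_tgt = kmeans.predict(features_tgt)

# A conditional translator G(x, pseudo_label) can then be penalized whenever the
# pseudo-label of a translated image disagrees with that of its input
# (a semantic-consistency term alongside the usual adversarial and cycle losses).
print(np.bincount(pseudo_src, minlength=N_CLASSES))
print(np.bincount(pseudo_tgt, minlength=N_CLASSES))
```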
### Summary: | this paper studies the problem of unsupervised domain translation here translation does not refer to language translation instead it refers to the idea of transferring highlevel semantic features specifically the authors look at digit style transfer between mnistpostal address numbers and svhnstreet view house numbers and sketches to reals the visuals look very convincing and the empirical results are strong too there is one weaker review but the authors address the concerns in their response and the reviewer did unfortunately not respond despite promting | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
8774,
253,
4477,
897,
5028,
13727,
31091,
35185,
281,
3157,
440,
35421,
5028,
10234,
18198,
85,
597,
3037,
841,
35185,
275,
271,
440,
35421,
5133,
597,
921,
849,
436,
476,
3157,
1543,
327,
24705,
24279,
440,
35421,
5028,
10234,
285,
3740,
22766,
5028,
10234,
407,
2509,
4679,
327,
278,
79,
1346,
87,
13107,
3386,
21533,
6311,
403,
1077,
1027,
533,
6670,
6489,
812,
320,
253,
1072,
285,
46159,
250,
932,
5799,
14957,
2975,
50274,
296,
3755,
20556,
50276,
783,
5304,
1543,
327,
1097,
8892,
46906,
1507,
403,
13631,
50276,
783,
2929,
310,
2969,
281,
1239,
285,
253,
2934,
310,
27350,
50276,
16217,
3825,
403,
9470,
1690,
271,
28913,
327,
253,
11655,
285,
5301,
1411,
1666,
25379,
50276,
20881,
1255,
265,
50275,
3062,
6667,
273,
46159,
6549,
812,
452,
644,
2011,
50276,
9453,
1057,
436,
1332,
1891,
690,
4433,
2219,
651,
320,
1175,
5474,
339,
431,
248,
2929,
12453,
253,
5028,
10234,
1895,
285,
29328,
247,
4460,
2746,
281,
16497,
3888,
875,
10625,
275,
271,
440,
35421,
5133,
407,
24399,
440,
35421,
4715,
273,
5028,
25168,
24705,
3386,
875,
253,
767,
10625,
253,
2929,
310,
973,
15720,
342,
247,
2590,
6306,
1127,
285,
16038,
2112,
342,
6210,
392,
265,
9397,
9021,
253,
2929,
556,
271,
25495,
285,
3590,
10527,
5301,
281,
2905,
789,
7103,
310,
6210,
392,
265,
1300,
285,
3797,
2045,
789,
275,
253,
1072,
3634,
275,
4583,
352,
310,
247,
1175,
2929,
342,
271,
3236,
285,
7826,
29853,
2934,
285,
247,
21414,
2898,
273,
17697,
305,
507,
651,
320,
271,
4722,
1239,
323,
1142,
275,
253,
8059,
50276,
555,
993,
3239,
337,
50276,
9029,
272,
326,
824,
247,
10234,
767,
9534,
835,
440,
35421,
5028,
10234,
3239,
818,
50276,
783,
14237,
273,
253,
1524,
3888,
50276,
26122,
337,
440,
35421,
5028,
10234,
3082,
789,
1955,
281,
271,
42115,
8492,
2584,
6489,
275,
7118,
337,
285,
4567,
878,
25577,
281,
1329,
436,
3908,
374,
7296,
17524,
310,
581,
273,
253,
1774,
4243,
273,
253,
9021,
352,
778,
878,
625,
15075,
275,
253,
2929,
352,
651,
320,
5322,
281,
1551,
7818,
436,
629,
9017,
253,
5955,
273,
18075,
285,
271,
1783,
281,
7277,
1027,
7274,
671,
407,
4886,
690,
629,
273,
253,
5955,
281,
253,
2929,
432,
253,
14801,
1271,
7152,
339,
793,
360,
3454,
436,
2929,
29328,
281,
3037,
253,
31091,
24705,
3386,
275,
271,
440,
35421,
5133,
281,
7278,
253,
3745,
273,
2460,
10234,
387,
24279,
1029,
5251,
3386,
285,
31326,
690,
1175,
1543,
327,
24705,
10192,
26368,
440,
35421,
5028,
10234,
285,
3740,
33709,
10553,
5028,
10234,
3237,
50274,
24330,
3374,
50275,
783,
4081,
1332,
3133,
281,
320,
247,
5019,
273,
1655,
2987,
253,
2022,
7680,
273,
436,
789,
778,
320,
19732,
2977,
253,
440,
35421,
6779,
4715,
323,
24705,
3386,
11998,
50273,
783,
3290,
273,
4561,
3888,
310,
1335,
417,
20297,
342,
824,
5233,
2440,
273,
305,
507,
50275,
783,
3368,
285,
18276,
7103,
403,
1512,
3710,
760,
767,
2460,
10234,
8892,
403,
5196,
323,
5301,
285,
1652,
5304,
1543,
403,
1677,
352,
588,
320,
9013,
604,
690,
1846,
891,
19,
74,
8892,
1543,
403,
1677,
760,
269,
301,
310,
908,
6240,
643,
17082,
824,
347,
39322,
2824,
295,
5470,
285,
480,
8289,
588,
320,
625,
21414,
50274,
37585,
3374,
50276,
5371,
403,
253,
5667,
3910,
875,
653,
438,
85,
285,
439,
7064,
1895,
849,
1057,
253,
1566,
8415,
253,
767,
3237,
2556,
281,
616,
3910,
50276,
7152,
33032,
2520,
2929,
10262,
440,
35421,
5028,
10234,
18198,
85,
7296,
767,
15216,
24705,
24279,
440,
35421,
5028,
10234,
653,
438,
85,
285,
310,
3740,
33709,
10553,
5028,
10234,
439,
7064,
436,
1263,
4648,
278,
79,
382,
285,
18504,
13107,
15302,
323,
17227,
653,
438,
85,
285,
46159,
285,
294,
932,
3530,
432,
253,
5028,
3024,
10895,
323,
17227,
439,
7064,
3738,
253,
1332,
4648,
1027,
4295,
7293,
327,
253,
10076,
390,
10895,
253,
3559,
10336,
310,
9093,
253,
1072,
50276,
783,
1332,
8414,
273,
253,
7792,
323,
4715,
13727,
31091,
35185,
2439,
10625,
2593,
4562,
285,
24705,
3740,
15673,
281,
1056,
439,
7064,
14649,
5185,
2593,
4567,
2593,
4562,
8414,
273,
440,
35421,
6779,
4715,
17524,
285,
440,
35421,
5028,
15644,
2299,
436,
2593,
858,
417,
6266,
253,
806,
767,
4295,
275,
2508,
3164,
984,
253,
1332,
4648,
1027,
4295,
7293,
327,
253,
15302,
2529,
275,
2593,
577,
440,
35421,
5028,
15644,
310,
8156,
407,
253,
41458,
1895,
2529,
275,
268,
21,
2593,
4567,
8631,
2600,
2349,
351,
398,
35185,
2349,
351,
398,
2797,
275,
2593,
4562,
10603,
6928,
3740,
2349,
351,
398,
285,
14156,
253,
2593,
10262,
253,
2957,
3470,
48960,
2957,
5880,
46540,
1371,
2957,
3740,
46540,
1371,
2957,
3740,
9991,
2957,
285,
24705,
2957,
50276,
4674,
577,
26662,
253,
4081,
1332,
342,
18198,
85,
1666,
25379,
762,
653,
438,
85,
285,
439,
7064,
15216,
253,
4679,
327,
278,
79,
382,
285,
18504,
13107,
921,
326,
253,
4081,
1332,
6786,
1029,
7200,
275,
5028,
10234,
285,
1029,
3290,
275,
253,
5978,
4080,
407,
269,
301,
253,
5661,
1543,
327,
46159,
285,
294,
932,
921,
326,
253,
4081,
1332,
11026,
1029,
15177,
14649,
4583,
253,
4679,
7568,
253,
6349,
273,
24049,
253,
35185,
7140,
275,
18198,
85,
50276,
856,
84,
50276,
262,
369,
4722,
281,
923,
326,
253,
3045,
273,
18198,
85,
369,
31063,
5520,
407,
24049,
24705,
9050,
5802,
407,
253,
17524,
11333,
50276,
783,
3559,
1332,
369,
5272,
285,
973,
4158,
50276,
5040,
50276,
783,
2022,
629,
273,
436,
2929,
310,
14086,
891,
574,
281,
564,
896,
285,
6593,
875,
253,
2022,
2133,
285,
253,
30762,
247,
2257,
50276,
2520,
2929,
1057,
417,
1646,
281,
22175,
253,
7681,
38135,
273,
436,
789,
3738,
2593,
4562,
369,
4158,
281,
1246,
247,
4460,
7792,
352,
1057,
417,
5513,
253,
2508,
273,
253,
4295,
3082,
323,
440,
35421,
6779,
4715,
285,
17524,
403,
5544,
275,
2593,
577,
253,
4445,
273,
440,
35421,
5028,
15644,
778,
320,
247,
747,
10419,
275,
436,
789,
2299,
436,
2929,
858,
417,
5513,
436,
4445,
275,
253,
2022,
2133,
891,
574,
281,
1239,
30762,
270,
22,
281,
2096,
253,
7212,
3559,
275,
268,
21,
50276,
783,
1566,
10336,
2529,
275,
2593,
4567,
310,
6571,
1754,
327,
2045,
789,
24088,
48960,
2957,
5880,
46540,
1371,
2957,
3966,
253,
24705,
2957,
3559,
275,
436,
2593,
778,
320,
247,
747,
1635,
281,
18198,
85,
533,
969,
436,
310,
417,
21947,
275,
2593,
4567,
50276,
34974,
50276,
81,
21,
359,
6266,
1110,
3963,
14460,
275,
625,
2508,
275,
30762,
270,
22,
604,
436,
629,
310,
247,
10419,
275,
436,
2929,
352,
943,
320,
5544,
347,
253,
2022,
2600,
273,
436,
2929,
604,
891,
513,
417,
1239,
30762,
270,
22,
891,
452,
642,
2934,
849,
5028,
374,
310,
3206,
275,
253,
41458,
1895,
50276,
37585,
5701,
50276,
81,
18,
326,
403,
42794,
23539,
2584,
253,
6489,
326,
50276,
21808,
50276,
262,
778,
320,
11638,
281,
452,
5150,
3904,
594,
326,
30628,
1088,
398,
476,
19912,
247,
7212,
3587,
187,
187,
4118,
18435,
27,
2520,
2929,
2175,
253,
1895,
273,
440,
35421,
5028,
10234,
1060,
10234,
1057,
417,
3730,
281,
3448,
10234,
3185,
352,
10770,
281,
253,
2934,
273,
27090,
1029,
5251,
24705,
3386,
5742,
253,
4477,
1007,
387,
6670,
3740,
3700,
875,
278,
79,
382,
5996,
267,
2953,
3904,
285,
18504,
13107,
33151,
1859,
2419,
3904,
285,
46159,
281,
294,
932,
253,
5304,
84,
1007,
1077,
21414,
285,
253,
16774,
1543,
403,
2266,
1512,
627,
310,
581,
21076,
2278,
533,
253,
4477,
2953,
253,
7350,
275,
616,
2380,
285,
253,
37317,
858,
19235,
417,
3794,
5747,
1964,
1076
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
8774,
253,
4477,
897,
5028,
13727,
31091,
35185,
281,
3157,
440,
35421,
5028,
10234,
18198,
85,
597,
3037,
841,
35185,
275,
271,
440,
35421,
5133,
597,
921,
849,
436,
476,
3157,
1543,
327,
24705,
24279,
440,
35421,
5028,
10234,
285,
3740,
22766,
5028,
10234,
407,
2509,
4679,
327,
278,
79,
1346,
87,
13107,
3386,
21533,
6311,
403,
1077,
1027,
533,
6670,
6489,
812,
320,
253,
1072,
285,
46159,
250,
932,
5799,
14957,
2975,
50274,
296,
3755,
20556,
50276,
783,
5304,
1543,
327,
1097,
8892,
46906,
1507,
403,
13631,
50276,
783,
2929,
310,
2969,
281,
1239,
285,
253,
2934,
310,
27350,
50276,
16217,
3825,
403,
9470,
1690,
271,
28913,
327,
253,
11655,
285,
5301,
1411,
1666,
25379,
50276,
20881,
1255,
265,
50275,
3062,
6667,
273,
46159,
6549,
812,
452,
644,
2011,
50276,
9453,
1057,
436,
1332,
1891,
690,
4433,
2219,
651,
320,
1175,
5474,
339,
431,
248,
2929,
12453,
253,
5028,
10234,
1895,
285,
29328,
247,
4460,
2746,
281,
16497,
3888,
875,
10625,
275,
271,
440,
35421,
5133,
407,
24399,
440,
35421,
4715,
273,
5028,
25168,
24705,
3386,
875,
253,
767,
10625,
253,
2929,
310,
973,
15720,
342,
247,
2590,
6306,
1127,
285,
16038,
2112,
342,
6210,
392,
265,
9397,
9021,
253,
2929,
556,
271,
25495,
285,
3590,
10527,
5301,
281,
2905,
789,
7103,
310,
6210,
392,
265,
1300,
285,
3797,
2045,
789,
275,
253,
1072,
3634,
275,
4583,
352,
310,
247,
1175,
2929,
342,
271,
3236,
285,
7826,
29853,
2934,
285,
247,
21414,
2898,
273,
17697,
305,
507,
651,
320,
271,
4722,
1239,
323,
1142,
275,
253,
8059,
50276,
555,
993,
3239,
337,
50276,
9029,
272,
326,
824,
247,
10234,
767,
9534,
835,
440,
35421,
5028,
10234,
3239,
818,
50276,
783,
14237,
273,
253,
1524,
3888,
50276,
26122,
337,
440,
35421,
5028,
10234,
3082,
789,
1955,
281,
271,
42115,
8492,
2584,
6489,
275,
7118,
337,
285,
4567,
878,
25577,
281,
1329,
436,
3908,
374,
7296,
17524,
310,
581,
273,
253,
1774,
4243,
273,
253,
9021,
352,
778,
878,
625,
15075,
275,
253,
2929,
352,
651,
320,
5322,
281,
1551,
7818,
436,
629,
9017,
253,
5955,
273,
18075,
285,
271,
1783,
281,
7277,
1027,
7274,
671,
407,
4886,
690,
629,
273,
253,
5955,
281,
253,
2929,
432,
253,
14801,
1271,
7152,
339,
793,
360,
3454,
436,
2929,
29328,
281,
3037,
253,
31091,
24705,
3386,
275,
271,
440,
35421,
5133,
281,
7278,
253,
3745,
273,
2460,
10234,
387,
24279,
1029,
5251,
3386,
285,
31326,
690,
1175,
1543,
327,
24705,
10192,
26368,
440,
35421,
5028,
10234,
285,
3740,
33709,
10553,
5028,
10234,
3237,
50274,
24330,
3374,
50275,
783,
4081,
1332,
3133,
281,
320,
247,
5019,
273,
1655,
2987,
253,
2022,
7680,
273,
436,
789,
778,
320,
19732,
2977,
253,
440,
35421,
6779,
4715,
323,
24705,
3386,
11998,
50273,
783,
3290,
273,
4561,
3888,
310,
1335,
417,
20297,
342,
824,
5233,
2440,
273,
305,
507,
50275,
783,
3368,
285,
18276,
7103,
403,
1512,
3710,
760,
767,
2460,
10234,
8892,
403,
5196,
323,
5301,
285,
1652,
5304,
1543,
403,
1677,
352,
588,
320,
9013,
604,
690,
1846,
891,
19,
74,
8892,
1543,
403,
1677,
760,
269,
301,
310,
908,
6240,
643,
17082,
824,
347,
39322,
2824,
295,
5470,
285,
480,
8289,
588,
320,
625,
21414,
50274,
37585,
3374,
50276,
5371,
403,
253,
5667,
3910,
875,
653,
438,
85,
285,
439,
7064,
1895,
849,
1057,
253,
1566,
8415,
253,
767,
3237,
2556,
281,
616,
3910,
50276,
7152,
33032,
2520,
2929,
10262,
440,
35421,
5028,
10234,
18198,
85,
7296,
767,
15216,
24705,
24279,
440,
35421,
5028,
10234,
653,
438,
85,
285,
310,
3740,
33709,
10553,
5028,
10234,
439,
7064,
436,
1263,
4648,
278,
79,
382,
285,
18504,
13107,
15302,
323,
17227,
653,
438,
85,
285,
46159,
285,
294,
932,
3530,
432,
253,
5028,
3024,
10895,
323,
17227,
439,
7064,
3738,
253,
1332,
4648,
1027,
4295,
7293,
327,
253,
10076,
390,
10895,
253,
3559,
10336,
310,
9093,
253,
1072,
50276,
783,
1332,
8414,
273,
253,
7792,
323,
4715,
13727,
31091,
35185,
2439,
10625,
2593,
4562,
285,
24705,
3740,
15673,
281,
1056,
439,
7064,
14649,
5185,
2593,
4567,
2593,
4562,
8414,
273,
440,
35421,
6779,
4715,
17524,
285,
440,
35421,
5028,
15644,
2299,
436,
2593,
858,
417,
6266,
253,
806,
767,
4295,
275,
2508,
3164,
984,
253,
1332,
4648,
1027,
4295,
7293,
327,
253,
15302,
2529,
275,
2593,
577,
440,
35421,
5028,
15644,
310,
8156,
407,
253,
41458,
1895,
2529,
275,
268,
21,
2593,
4567,
8631,
2600,
2349,
351,
398,
35185,
2349,
351,
398,
2797,
275,
2593,
4562,
10603,
6928,
3740,
2349,
351,
398,
285,
14156,
253,
2593,
10262,
253,
2957,
3470,
48960,
2957,
5880,
46540,
1371,
2957,
3740,
46540,
1371,
2957,
3740,
9991,
2957,
285,
24705,
2957,
50276,
4674,
577,
26662,
253,
4081,
1332,
342,
18198,
85,
1666,
25379,
762,
653,
438,
85,
285,
439,
7064,
15216,
253,
4679,
327,
278,
79,
382,
285,
18504,
13107,
921,
326,
253,
4081,
1332,
6786,
1029,
7200,
275,
5028,
10234,
285,
1029,
3290,
275,
253,
5978,
4080,
407,
269,
301,
253,
5661,
1543,
327,
46159,
285,
294,
932,
921,
326,
253,
4081,
1332,
11026,
1029,
15177,
14649,
4583,
253,
4679,
7568,
253,
6349,
273,
24049,
253,
35185,
7140,
275,
18198,
85,
50276,
856,
84,
50276,
262,
369,
4722,
281,
923,
326,
253,
3045,
273,
18198,
85,
369,
31063,
5520,
407,
24049,
24705,
9050,
5802,
407,
253,
17524,
11333,
50276,
783,
3559,
1332,
369,
5272,
285,
973,
4158,
50276,
5040,
50276,
783,
2022,
629,
273,
436,
2929,
310,
14086,
891,
574,
281,
564,
896,
285,
6593,
875,
253,
2022,
2133,
285,
253,
30762,
247,
2257,
50276,
2520,
2929,
1057,
417,
1646,
281,
22175,
253,
7681,
38135,
273,
436,
789,
3738,
2593,
4562,
369,
4158,
281,
1246,
247,
4460,
7792,
352,
1057,
417,
5513,
253,
2508,
273,
253,
4295,
3082,
323,
440,
35421,
6779,
4715,
285,
17524,
403,
5544,
275,
2593,
577,
253,
4445,
273,
440,
35421,
5028,
15644,
778,
320,
247,
747,
10419,
275,
436,
789,
2299,
436,
2929,
858,
417,
5513,
436,
4445,
275,
253,
2022,
2133,
891,
574,
281,
1239,
30762,
270,
22,
281,
2096,
253,
7212,
3559,
275,
268,
21,
50276,
783,
1566,
10336,
2529,
275,
2593,
4567,
310,
6571,
1754,
327,
2045,
789,
24088,
48960,
2957,
5880,
46540,
1371,
2957,
3966,
253,
24705,
2957,
3559,
275,
436,
2593,
778,
320,
247,
747,
1635,
281,
18198,
85,
533,
969,
436,
310,
417,
21947,
275,
2593,
4567,
50276,
34974,
50276,
81,
21,
359,
6266,
1110,
3963,
14460,
275,
625,
2508,
275,
30762,
270,
22,
604,
436,
629,
310,
247,
10419,
275,
436,
2929,
352,
943,
320,
5544,
347,
253,
2022,
2600,
273,
436,
2929,
604,
891,
513,
417,
1239,
30762,
270,
22,
891,
452,
642,
2934,
849,
5028,
374,
310,
3206,
275,
253,
41458,
1895,
50276,
37585,
5701,
50276,
81,
18,
326,
403,
42794,
23539,
2584,
253,
6489,
326,
50276,
21808,
50276,
262,
778,
320,
11638,
281,
452,
5150,
3904,
594,
326,
30628,
1088,
398,
476,
19912,
247,
7212,
3587,
187,
187,
4118,
18435,
27,
2520,
2929,
2175,
253,
1895,
273,
440,
35421,
5028,
10234,
1060,
10234,
1057,
417,
3730,
281,
3448,
10234,
3185,
352,
10770,
281,
253,
2934,
273,
27090,
1029,
5251,
24705,
3386,
5742,
253,
4477,
1007,
387,
6670,
3740,
3700,
875,
278,
79,
382,
5996,
267,
2953,
3904,
285,
18504,
13107,
33151,
1859,
2419,
3904,
285,
46159,
281,
294,
932,
253,
5304,
84,
1007,
1077,
21414,
285,
253,
16774,
1543,
403,
2266,
1512,
627,
310,
581,
21076,
2278,
533,
253,
4477,
2953,
253,
7350,
275,
616,
2380,
285,
253,
37317,
858,
19235,
417,
3794,
5747,
1964,
1076
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents a novel approach to tackle the problem of unconstrained joint error in previous unsupervised domain adaptation theory it builds up a new upper bound that under certain circumstances reduces to an upper bound of optimal joint error in practice it instantiates two methods shs and ths to approximate a hypothesis space that most likely contains ft furthermore it proposes a novel cross margin discrepancy to alleviate instability during adversarial learning the main weaknesses of this paper include the following aspects 1 this paper is not well written and some parts are hard to follow it lacks necessary logical transition and important figures for example it lacks explanations to support the connection between the proposed training objective and the cross margin discrepancy also it should at least contain one figure to explain the overall architecture or training pipeline 2 the authors claim that there is still no research focusing on the joint error for uda but this problem of arbitrarily increased joint error has already been studied in previous works like domain adaptation with asymmetricallyrelaxed distribution alignment in icml2019 the authors should discuss that work and directly illustrate the relationship between that work and the proposed one and why the proposed method is better 3 although the joint error is indeed included in the proposed upper bound in practice the authors have to use sourcedriven hypothesis space and targetdriven hypothesis space to obtain approximation of fs and ft to me in practice the use of three classifiers h f1 f2 is just like an improvement over mcd hence i doubt whether the proposed method can still simultaneously minimize the domain discrepancy and the joint error for example as shown in the digit experiments the performance is highly sensitive to the choice of gamma in shs and sometimes the optimal gamma value is conflicting for different domains in the same dataset which is strange since according to the papers theorem smaller gamma only means more relaxed constraint on hypothesis space also as shown in the visda experiments the optimal value of eta is close to 1 which means classification error from the approximate target domain is basically useless 4 the benchmark results are inferior to the stateoftheart methods for instance the contrastive adaptation network achieves an average of 87.2 on visda2017 which is much higher than 79.7 achieved by the proposed method and the same goes with digit office31 and officehome dataset docsepthis paper proposes a new algorithm for unsupervised domain adaptation taking the adaptability term the joint error into consideration instead of minimizing only the domain discrepancy the authors also justify their method with some theoretical intuitions empirical results show that their method achieves stateoftheart performance the paper is overall clearly organized and easy to follow minimizing the joint error along with the domain discrepancy is a novel idea i note that this paper appeared in iclr 2020 paper 559 for the previous version reviewers raised concerns about 1 the realizability assumption f1 and f2 are not in h 2 the derivation of the general bound can be confusing 3 lack of ablation study to probe into where the improvement of performance comes in the current manuscript the authors deal with the problems above 1 add the analysis where the realizability assumption does not hold 2 reorganize the derivation to make their intuition clear 3 also add some part of the ablation study i believe the
authors have addressed most of the concerns in iclr 2020 paper 559 thus this work can be more ready for publication docsepthis paper studies the problem of unsupervised domain adaptation giving a theoretical analysis that yields a new upper bound on the target error in particular in contrast to much existing work this paper focuses on the role of the ideal joint sourcetarget error in the upper bound the paper proposes two methods for improved results within the framework of the upper bound constraining the searched hypothesis space in two possible manners sourcedriven and target driven and proposing a marginbased measure of dissimilarity between hypotheses positives derivation of novel upper bound for target error demonstrates that two existing methods for unsupervised domain adaptation can be derived as special cases of the proposed framework ablation study demonstrates the relative improvements from each proposed component good empirical results relative to existing work negatives one main concern i have is with the role of eta in the targetdriven hypothesis space constraint first it would ideally be nice to have some intuition about its role and how it might be set second it appears different values are optimal depending on the data set sometimes being set to 0 and other times 0.9 i would have liked to see some sensitivity analysis to this value more importantly it appears that the value is being set to directly optimize the test performance if so it should instead be set based on some separate validation data and the accuracy numbers updated to reflect this would have been nice if the ablation study had been applied to more than just one specific data set and direction overall i found this to be a nice paper but would like to see the concern above about eta addresseddocsepsummary this paper tries to study unsupervised domain adaptation problem via minimized joint error the authors propose an upper bound including joint error term and design algorithm on the basis of the theoretical results good experimental results show the effectiveness of the proposed method strengths this paper studies unsupervised domain adaptation in the perspective of joint error which i think is one of the most important issues in the deep regime i really like the topic of the paper the proposed method works well in several benchmarks this paper conveys some interesting conclusions by showing some interesting examples in figures hence this paper is very interesting and easy to follow weaknesses i like the topic of the paper nevertheless im not convinced by the theoretical results and some claims are not well supported this is my main scoring basis also i think the practicality of the method needs further verification the specific comments are as follows unreliable theoretical analysis i dont think introducing true labeling functions is a good choice for minimized joint error we can see that cs tfs ft h is actually an upper bound of joint error in inequality 2 nevertheless i dont think it makes any sense and it is just an upper bound on the one hand the bound is not tighter on the other hand without assumptions about fs ft the bound is pointless for example we can set f1 and f2 predict opposite results and then cs tf1 f2 h is larger than m for any h in this point of view the theory actually relies on assumptions about the fs and ft fs ft are contained in hypothesis space is a very strong assumption in some real scenarios with highdimensional input spaces the true labeling function may be very complex and can not be
contained in the hypothesis space additionally in order to ensure that the bound is not trivial authors even need to restrict fs ft in subsets of hypothesis the assumption that ft can be found by source risk minimization with a smaller weight may not be reliable by the result of i neural networks could memorize any labels so weights may not impact the memorization of source data and the claim gamma is accuracy and used as weighted source error is very confusing do you mean a small learning rate for source loss will lead to a low accuracy and their relationship is approximately linear i think this claim is clearly refuted by i the assumption that ft could be achieved by source loss minimization and pseudo label training needs theoretical guarantees i think the method of using pseudo label is difficult to guarantee performance when the two domains are distant pseudo label may be unreliable using pseudo label may be okay for relatively simple benchmarks nevertheless i doubt whether it is helpful for solving complex problems especially many practical problems i am very interested in the performance of the proposed method on domainnet ii which is a difficult benchmark for domain adaptation the experimental results are very sensitive to hyperparameters and may not be applicable in real problems the proposed method achieves best results on different datasets with different choices of eta and gamma however how to select a proper hyperparameter is not well stated the motivation of the paper is similar to iii which shows some interesting empirical results this paper can give the value of joint error empirically similar to iii i zhang et al understanding deep learning requires rethinking generalization ii peng et al moment matching for multisource domain adaptation iii chen et al transferability vs discriminability batch spectral penalization for adversarial domain adaptation the authors response addresses a small part of my concerns so i change my score however i still recommend rejecting this submission details are shown in my response
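editor's note: for readers unfamiliar with the quantity the last two reviews argue about, the classical target-error bound they refer to (in the standard notation of ben-david et al., which may differ from this paper's exact statement) is

$$\epsilon_T(h) \;\le\; \epsilon_S(h) \;+\; \tfrac{1}{2}\, d_{\mathcal{H}\Delta\mathcal{H}}(\mathcal{D}_S, \mathcal{D}_T) \;+\; \lambda^{*}, \qquad \lambda^{*} \;=\; \min_{h' \in \mathcal{H}} \big[\, \epsilon_S(h') + \epsilon_T(h') \,\big],$$

where $\lambda^{*}$ is the ideal joint error: it is fixed by the hypothesis class and the two domains, so minimizing the discrepancy term alone cannot control it, which is exactly the adaptability issue these reviews discuss.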
### Summary: | this submission provides a new bound and derived method for unsupervised domain adaptation based on adversarial training the method is then extensively evaluated empirically pro the proposed method seems empirically successful con i agree with one of the reviewers that the presented theoretical justification is not convincing there should be assumptions on how fs and and ft relate in order to drive meaningful guarantees in the proposed framework the presented results here seem to be a case of a method that was motivated by some theoretical consideration and worked well after some heuristic approximations were made it would be nice to see a cleaner analysis of the conditions under which the method will be successful | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
10262,
247,
4460,
2746,
281,
18915,
253,
1895,
273,
440,
48454,
6036,
2228,
275,
2045,
440,
35421,
5028,
15644,
3762,
352,
21168,
598,
247,
747,
5170,
3033,
326,
762,
2176,
5989,
11355,
281,
271,
5170,
3033,
273,
8654,
6036,
2228,
275,
3946,
352,
8164,
28032,
767,
3082,
439,
84,
285,
289,
84,
281,
16851,
247,
9079,
2317,
326,
954,
2779,
4428,
23899,
33810,
352,
29328,
247,
4460,
2831,
8459,
26210,
281,
33623,
17620,
1309,
48960,
4715,
50276,
783,
2022,
32213,
273,
436,
2929,
2486,
253,
1563,
7794,
337,
186,
2520,
2929,
310,
417,
973,
3542,
285,
690,
4243,
403,
1892,
281,
956,
352,
19756,
3309,
13760,
5502,
285,
1774,
8442,
323,
1650,
352,
19756,
22909,
281,
1329,
253,
4602,
875,
253,
4081,
3733,
8103,
285,
253,
2831,
8459,
26210,
671,
352,
943,
387,
1878,
3831,
581,
4677,
281,
5513,
253,
4583,
10336,
390,
3733,
15722,
374,
186,
783,
4477,
1750,
326,
627,
310,
1335,
642,
2561,
13654,
327,
253,
6036,
2228,
323,
209,
14776,
533,
436,
1895,
273,
29607,
2559,
6036,
2228,
556,
2168,
644,
5421,
275,
2045,
2987,
751,
5028,
15644,
342,
40736,
16671,
39471,
264,
3268,
12420,
275,
17857,
1686,
9638,
253,
4477,
943,
2319,
327,
326,
789,
285,
3587,
17093,
253,
2954,
875,
326,
789,
285,
253,
4081,
581,
285,
2139,
253,
4081,
1332,
310,
1805,
495,
186,
20261,
253,
6036,
2228,
310,
6296,
2908,
275,
253,
4081,
5170,
3033,
275,
3946,
253,
4477,
452,
281,
897,
47344,
1069,
257,
9079,
2317,
285,
2303,
17477,
9079,
2317,
281,
4044,
11193,
273,
25290,
285,
23899,
281,
479,
275,
3946,
253,
897,
273,
1264,
49996,
288,
269,
18,
269,
19,
310,
816,
751,
271,
7756,
689,
278,
2428,
7613,
891,
5545,
1880,
253,
4081,
1332,
476,
1335,
10486,
15338,
253,
5028,
26210,
285,
253,
6036,
2228,
323,
1650,
347,
2011,
275,
253,
6670,
4679,
253,
3045,
310,
4122,
7996,
281,
253,
4327,
273,
17356,
275,
439,
84,
285,
4536,
253,
8654,
17356,
1318,
310,
24648,
323,
1027,
10625,
275,
253,
1072,
10895,
534,
310,
8921,
1580,
2556,
281,
253,
9380,
10012,
4577,
17356,
760,
2097,
625,
19595,
7658,
327,
9079,
2317,
671,
347,
2011,
275,
253,
1649,
1473,
4679,
253,
8654,
1318,
273,
1162,
66,
310,
2810,
281,
337,
534,
2097,
9162,
2228,
432,
253,
16851,
2303,
5028,
310,
10323,
19437,
577,
186,
783,
22791,
1543,
403,
18134,
281,
253,
1375,
23037,
14387,
3082,
323,
4227,
253,
4499,
422,
15644,
2990,
33526,
271,
3388,
273,
854,
3547,
327,
1649,
1473,
7132,
534,
310,
1199,
2169,
685,
818,
4148,
6786,
407,
253,
4081,
1332,
285,
253,
1072,
4566,
342,
6670,
3906,
2405,
285,
3906,
9511,
10895,
5474,
33032,
2520,
2929,
29328,
247,
747,
5933,
323,
440,
35421,
5028,
15644,
3192,
253,
5223,
1430,
1307,
253,
6036,
2228,
715,
8180,
3185,
273,
28699,
760,
253,
5028,
26210,
253,
4477,
671,
15249,
616,
1332,
342,
690,
10527,
16875,
4431,
16774,
1543,
921,
326,
616,
1332,
33526,
1375,
23037,
14387,
3045,
50276,
783,
2929,
310,
4583,
4518,
10932,
285,
3477,
281,
956,
28699,
253,
6036,
2228,
2112,
342,
253,
5028,
26210,
310,
247,
4460,
2934,
50276,
74,
3877,
326,
436,
2929,
5420,
275,
17857,
32888,
9169,
2929,
47534,
323,
253,
2045,
2715,
30628,
5439,
7350,
670,
337,
186,
783,
42924,
1430,
9376,
269,
18,
285,
269,
19,
403,
417,
275,
288,
374,
186,
783,
28529,
273,
253,
2087,
3033,
476,
320,
21643,
495,
186,
77,
471,
273,
28913,
1263,
281,
10304,
715,
835,
253,
7756,
273,
3045,
3249,
50276,
249,
253,
1655,
7714,
253,
4477,
2968,
342,
253,
3237,
1840,
337,
186,
1911,
253,
1783,
835,
253,
42924,
1430,
9376,
1057,
417,
2186,
374,
186,
250,
7397,
907,
253,
28529,
281,
1056,
616,
30328,
2590,
495,
186,
12563,
823,
690,
629,
273,
253,
28913,
1263,
50276,
74,
2868,
253,
4477,
452,
9713,
954,
273,
253,
7350,
275,
17857,
32888,
9169,
2929,
47534,
3021,
436,
789,
476,
320,
625,
4704,
323,
9311,
50276,
7152,
33032,
2520,
2929,
2175,
253,
1895,
273,
440,
35421,
5028,
5223,
279,
4933,
247,
10527,
1783,
326,
11026,
247,
747,
5170,
3033,
327,
253,
2303,
2228,
50276,
249,
1798,
275,
4499,
281,
1199,
5368,
789,
436,
2929,
16633,
327,
253,
2554,
273,
253,
7445,
6036,
18988,
68,
292,
1816,
2228,
275,
253,
5170,
3033,
50276,
783,
2929,
29328,
767,
3082,
323,
5520,
1543,
1561,
253,
7792,
273,
253,
5170,
3033,
50276,
3474,
26208,
253,
16113,
9079,
2317,
275,
767,
1896,
34323,
47344,
1069,
257,
285,
2303,
8877,
285,
36636,
247,
247,
8459,
3169,
2557,
273,
43110,
414,
875,
24316,
50274,
993,
23223,
50276,
491,
7639,
273,
4460,
5170,
3033,
323,
2303,
2228,
50276,
48387,
684,
326,
767,
5368,
3082,
323,
440,
35421,
5028,
15644,
476,
320,
6012,
347,
2714,
2219,
273,
253,
4081,
7792,
50276,
1752,
318,
1263,
14371,
253,
4103,
11701,
432,
1016,
4081,
4445,
50276,
12311,
16774,
1543,
4103,
281,
5368,
789,
50276,
8265,
3993,
50276,
531,
2022,
4468,
891,
452,
310,
342,
253,
2554,
273,
1162,
66,
275,
253,
2303,
27190,
9079,
2317,
7658,
50276,
7053,
352,
651,
34243,
320,
5322,
281,
452,
690,
30328,
670,
697,
2554,
285,
849,
352,
1537,
320,
873,
50276,
9815,
352,
4620,
1027,
2193,
403,
8654,
7293,
327,
253,
941,
873,
4536,
1146,
873,
281,
470,
285,
643,
2069,
15630,
50276,
74,
651,
452,
10490,
281,
923,
690,
7340,
1783,
281,
436,
1318,
50276,
3062,
15538,
352,
4620,
326,
253,
1318,
310,
1146,
873,
281,
3587,
22318,
253,
1071,
3045,
50276,
338,
594,
352,
943,
3185,
320,
873,
1754,
327,
690,
4858,
12820,
941,
285,
253,
7200,
3904,
9300,
281,
4887,
436,
50276,
12756,
452,
644,
5322,
604,
253,
28913,
1263,
574,
644,
3732,
281,
625,
685,
816,
581,
2173,
941,
873,
21285,
50276,
1189,
455,
891,
1119,
436,
281,
320,
247,
5322,
2929,
533,
651,
751,
281,
923,
253,
4468,
1840,
670,
1162,
66,
9713,
7152,
339,
793,
360,
3454,
436,
2929,
14177,
281,
1263,
440,
35421,
5028,
15644,
1895,
3066,
36625,
6036,
2228,
253,
4477,
12661,
271,
5170,
3033,
1690,
6036,
2228,
1307,
285,
2216,
5933,
327,
253,
3720,
273,
253,
10527,
1543,
1175,
5661,
1543,
921,
253,
12510,
273,
253,
4081,
1332,
50275,
296,
3755,
20556,
50276,
2520,
2929,
2175,
440,
35421,
5028,
15644,
275,
253,
8668,
273,
6036,
2228,
534,
891,
1158,
310,
581,
273,
253,
954,
1774,
3374,
275,
253,
3676,
9459,
891,
1663,
751,
253,
9400,
273,
253,
2929,
50276,
783,
4081,
1332,
2987,
973,
275,
2067,
49602,
50276,
2520,
2929,
11785,
656,
690,
4722,
11815,
407,
4645,
690,
4722,
6667,
275,
8442,
7613,
436,
2929,
310,
1077,
4722,
285,
3477,
281,
956,
50276,
20881,
1255,
265,
50276,
74,
751,
253,
9400,
273,
253,
2929,
17837,
516,
417,
13762,
407,
253,
10527,
1543,
285,
690,
3916,
403,
417,
973,
4516,
436,
310,
619,
2022,
14755,
3720,
50276,
12563,
891,
1158,
253,
8542,
414,
273,
253,
1332,
3198,
2007,
21999,
253,
2173,
5701,
403,
347,
3637,
50276,
328,
31631,
10527,
1783,
891,
13414,
1158,
16984,
2032,
21473,
3470,
310,
247,
1175,
4327,
323,
36625,
6036,
2228,
359,
476,
923,
326,
29180,
246,
3671,
23899,
288,
310,
2686,
271,
5170,
3033,
273,
6036,
2228,
275,
11370,
374,
17837,
891,
13414,
1158,
352,
2789,
667,
3282,
285,
352,
310,
816,
271,
5170,
3033,
327,
253,
581,
1133,
253,
3033,
310,
417,
40638,
327,
253,
643,
1133,
1293,
13260,
670,
25290,
23899,
253,
3033,
310,
41690,
323,
1650,
359,
476,
873,
269,
18,
285,
269,
19,
3283,
7285,
1543,
285,
840,
29180,
28793,
18,
269,
19,
288,
310,
4067,
685,
278,
323,
667,
288,
275,
436,
1127,
273,
1859,
253,
3762,
2686,
15771,
327,
13260,
670,
253,
25290,
285,
23899,
50276,
3671,
23899,
403,
6221,
275,
9079,
2317,
310,
247,
1077,
2266,
9376,
275,
690,
1524,
15216,
342,
1029,
6967,
3280,
8470,
253,
2032,
21473,
1159,
778,
320,
1077,
2570,
285,
476,
417,
320,
6221,
275,
253,
9079,
2317,
23000,
275,
1340,
281,
5416,
326,
253,
3033,
310,
417,
14916,
4477,
1014,
878,
281,
4656,
25290,
23899,
275,
20077,
273,
9079,
50275,
783,
9376,
326,
23899,
476,
320,
1119,
407,
2603,
2495,
41458,
342,
247,
4577,
2801,
778,
417,
320,
9630,
407,
253,
906,
273,
891,
11454,
6928,
812,
16407,
907,
667,
13301,
594,
13461,
778,
417,
3486,
253,
16407,
1320,
273,
2603,
941,
285,
253,
1750,
17356,
50276,
261,
756,
3702,
285,
908,
347,
17375,
2603,
2228,
310,
1077,
21643,
513,
368,
1599,
247,
1355,
4715,
2281,
323,
2603,
2957,
588,
1421,
281,
247,
1698,
756,
3702,
285,
616,
2954,
310,
5512,
4872,
891,
1158,
436,
1750,
310,
4518,
1275,
4525,
407,
891,
50276,
783,
9376,
326,
23899,
812,
320,
6786,
407,
2603,
2957,
41458,
285,
17927,
5203,
3733,
3198,
10527,
23632,
891,
1158,
253,
1332,
273,
970,
17927,
5203,
310,
2834,
281,
12215,
3045,
672,
253,
767,
10625,
403,
13392,
17927,
5203,
778,
320,
36230,
970,
17927,
5203,
778,
320,
8261,
323,
4942,
2969,
22791,
17837,
891,
5545,
1880,
352,
310,
9371,
323,
16161,
2570,
3237,
3340,
1142,
8542,
3237,
891,
717,
1077,
6110,
275,
253,
3045,
273,
253,
4081,
1332,
327,
5028,
3024,
2886,
534,
310,
247,
2834,
22791,
323,
5028,
15644,
50276,
783,
5661,
1543,
403,
1077,
7996,
281,
4373,
22041,
285,
778,
417,
320,
7763,
275,
1524,
1895,
253,
4081,
1332,
33526,
1682,
1543,
327,
1027,
15302,
342,
1027,
10165,
273,
1162,
66,
285,
17356,
2299,
849,
281,
3609,
247,
1463,
4373,
19484,
310,
417,
973,
4767,
50276,
783,
16038,
273,
253,
2929,
310,
2074,
281,
37685,
534,
2722,
690,
4722,
16774,
1543,
436,
2929,
476,
1918,
253,
1318,
273,
6036,
2228,
45190,
2074,
281,
37685,
50276,
74,
1182,
12109,
1162,
355,
4685,
3676,
4715,
4419,
294,
37341,
26647,
50276,
2886,
42151,
1162,
355,
2774,
11038,
323,
1554,
261,
1505,
5028,
15644,
50276,
12211,
260,
864,
1162,
355,
3700,
1430,
4632,
20741,
1430,
14604,
9879,
29697,
1320,
323,
48960,
5028,
15644,
50275,
783,
4477,
2380,
12453,
247,
1355,
629,
273,
619,
7350,
594,
891,
1818,
619,
4868,
2299,
891,
1335,
5583,
33944,
436,
19529,
4278,
403,
2011,
275,
619,
2380,
50275,
187,
187,
4118,
18435,
27,
2520,
19529,
3400,
247,
747,
3033,
285,
6012,
1332,
323,
440,
35421,
5028,
15644,
1754,
327,
48960,
3733,
50276,
783,
1332,
310,
840,
18171,
6760,
45190,
50276,
856,
50276,
783,
4081,
1332,
3133,
45190,
5547,
50276,
585,
50276,
74,
5194,
342,
581,
273,
253,
30628,
326,
253,
3559,
10527,
22861,
310,
417,
21414,
627,
943,
320,
13260,
327,
849,
25290,
285,
285,
23899,
14588,
275,
1340,
281,
4446,
14282,
23632,
275,
253,
4081,
7792,
253,
3559,
1543,
1060,
1646,
281,
320,
247,
1083,
273,
247,
1332,
326,
369,
17194,
407,
690,
10527,
8180,
285,
4307,
973,
846,
690,
47641,
34754,
497,
1160,
352,
651,
320,
5322,
281,
923,
247,
28452,
1783,
273,
253,
2515,
762,
534,
253,
1332,
588,
320,
5547
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
10262,
247,
4460,
2746,
281,
18915,
253,
1895,
273,
440,
48454,
6036,
2228,
275,
2045,
440,
35421,
5028,
15644,
3762,
352,
21168,
598,
247,
747,
5170,
3033,
326,
762,
2176,
5989,
11355,
281,
271,
5170,
3033,
273,
8654,
6036,
2228,
275,
3946,
352,
8164,
28032,
767,
3082,
439,
84,
285,
289,
84,
281,
16851,
247,
9079,
2317,
326,
954,
2779,
4428,
23899,
33810,
352,
29328,
247,
4460,
2831,
8459,
26210,
281,
33623,
17620,
1309,
48960,
4715,
50276,
783,
2022,
32213,
273,
436,
2929,
2486,
253,
1563,
7794,
337,
186,
2520,
2929,
310,
417,
973,
3542,
285,
690,
4243,
403,
1892,
281,
956,
352,
19756,
3309,
13760,
5502,
285,
1774,
8442,
323,
1650,
352,
19756,
22909,
281,
1329,
253,
4602,
875,
253,
4081,
3733,
8103,
285,
253,
2831,
8459,
26210,
671,
352,
943,
387,
1878,
3831,
581,
4677,
281,
5513,
253,
4583,
10336,
390,
3733,
15722,
374,
186,
783,
4477,
1750,
326,
627,
310,
1335,
642,
2561,
13654,
327,
253,
6036,
2228,
323,
209,
14776,
533,
436,
1895,
273,
29607,
2559,
6036,
2228,
556,
2168,
644,
5421,
275,
2045,
2987,
751,
5028,
15644,
342,
40736,
16671,
39471,
264,
3268,
12420,
275,
17857,
1686,
9638,
253,
4477,
943,
2319,
327,
326,
789,
285,
3587,
17093,
253,
2954,
875,
326,
789,
285,
253,
4081,
581,
285,
2139,
253,
4081,
1332,
310,
1805,
495,
186,
20261,
253,
6036,
2228,
310,
6296,
2908,
275,
253,
4081,
5170,
3033,
275,
3946,
253,
4477,
452,
281,
897,
47344,
1069,
257,
9079,
2317,
285,
2303,
17477,
9079,
2317,
281,
4044,
11193,
273,
25290,
285,
23899,
281,
479,
275,
3946,
253,
897,
273,
1264,
49996,
288,
269,
18,
269,
19,
310,
816,
751,
271,
7756,
689,
278,
2428,
7613,
891,
5545,
1880,
253,
4081,
1332,
476,
1335,
10486,
15338,
253,
5028,
26210,
285,
253,
6036,
2228,
323,
1650,
347,
2011,
275,
253,
6670,
4679,
253,
3045,
310,
4122,
7996,
281,
253,
4327,
273,
17356,
275,
439,
84,
285,
4536,
253,
8654,
17356,
1318,
310,
24648,
323,
1027,
10625,
275,
253,
1072,
10895,
534,
310,
8921,
1580,
2556,
281,
253,
9380,
10012,
4577,
17356,
760,
2097,
625,
19595,
7658,
327,
9079,
2317,
671,
347,
2011,
275,
253,
1649,
1473,
4679,
253,
8654,
1318,
273,
1162,
66,
310,
2810,
281,
337,
534,
2097,
9162,
2228,
432,
253,
16851,
2303,
5028,
310,
10323,
19437,
577,
186,
783,
22791,
1543,
403,
18134,
281,
253,
1375,
23037,
14387,
3082,
323,
4227,
253,
4499,
422,
15644,
2990,
33526,
271,
3388,
273,
854,
3547,
327,
1649,
1473,
7132,
534,
310,
1199,
2169,
685,
818,
4148,
6786,
407,
253,
4081,
1332,
285,
253,
1072,
4566,
342,
6670,
3906,
2405,
285,
3906,
9511,
10895,
5474,
33032,
2520,
2929,
29328,
247,
747,
5933,
323,
440,
35421,
5028,
15644,
3192,
253,
5223,
1430,
1307,
253,
6036,
2228,
715,
8180,
3185,
273,
28699,
760,
253,
5028,
26210,
253,
4477,
671,
15249,
616,
1332,
342,
690,
10527,
16875,
4431,
16774,
1543,
921,
326,
616,
1332,
33526,
1375,
23037,
14387,
3045,
50276,
783,
2929,
310,
4583,
4518,
10932,
285,
3477,
281,
956,
28699,
253,
6036,
2228,
2112,
342,
253,
5028,
26210,
310,
247,
4460,
2934,
50276,
74,
3877,
326,
436,
2929,
5420,
275,
17857,
32888,
9169,
2929,
47534,
323,
253,
2045,
2715,
30628,
5439,
7350,
670,
337,
186,
783,
42924,
1430,
9376,
269,
18,
285,
269,
19,
403,
417,
275,
288,
374,
186,
783,
28529,
273,
253,
2087,
3033,
476,
320,
21643,
495,
186,
77,
471,
273,
28913,
1263,
281,
10304,
715,
835,
253,
7756,
273,
3045,
3249,
50276,
249,
253,
1655,
7714,
253,
4477,
2968,
342,
253,
3237,
1840,
337,
186,
1911,
253,
1783,
835,
253,
42924,
1430,
9376,
1057,
417,
2186,
374,
186,
250,
7397,
907,
253,
28529,
281,
1056,
616,
30328,
2590,
495,
186,
12563,
823,
690,
629,
273,
253,
28913,
1263,
50276,
74,
2868,
253,
4477,
452,
9713,
954,
273,
253,
7350,
275,
17857,
32888,
9169,
2929,
47534,
3021,
436,
789,
476,
320,
625,
4704,
323,
9311,
50276,
7152,
33032,
2520,
2929,
2175,
253,
1895,
273,
440,
35421,
5028,
5223,
279,
4933,
247,
10527,
1783,
326,
11026,
247,
747,
5170,
3033,
327,
253,
2303,
2228,
50276,
249,
1798,
275,
4499,
281,
1199,
5368,
789,
436,
2929,
16633,
327,
253,
2554,
273,
253,
7445,
6036,
18988,
68,
292,
1816,
2228,
275,
253,
5170,
3033,
50276,
783,
2929,
29328,
767,
3082,
323,
5520,
1543,
1561,
253,
7792,
273,
253,
5170,
3033,
50276,
3474,
26208,
253,
16113,
9079,
2317,
275,
767,
1896,
34323,
47344,
1069,
257,
285,
2303,
8877,
285,
36636,
247,
247,
8459,
3169,
2557,
273,
43110,
414,
875,
24316,
50274,
993,
23223,
50276,
491,
7639,
273,
4460,
5170,
3033,
323,
2303,
2228,
50276,
48387,
684,
326,
767,
5368,
3082,
323,
440,
35421,
5028,
15644,
476,
320,
6012,
347,
2714,
2219,
273,
253,
4081,
7792,
50276,
1752,
318,
1263,
14371,
253,
4103,
11701,
432,
1016,
4081,
4445,
50276,
12311,
16774,
1543,
4103,
281,
5368,
789,
50276,
8265,
3993,
50276,
531,
2022,
4468,
891,
452,
310,
342,
253,
2554,
273,
1162,
66,
275,
253,
2303,
27190,
9079,
2317,
7658,
50276,
7053,
352,
651,
34243,
320,
5322,
281,
452,
690,
30328,
670,
697,
2554,
285,
849,
352,
1537,
320,
873,
50276,
9815,
352,
4620,
1027,
2193,
403,
8654,
7293,
327,
253,
941,
873,
4536,
1146,
873,
281,
470,
285,
643,
2069,
15630,
50276,
74,
651,
452,
10490,
281,
923,
690,
7340,
1783,
281,
436,
1318,
50276,
3062,
15538,
352,
4620,
326,
253,
1318,
310,
1146,
873,
281,
3587,
22318,
253,
1071,
3045,
50276,
338,
594,
352,
943,
3185,
320,
873,
1754,
327,
690,
4858,
12820,
941,
285,
253,
7200,
3904,
9300,
281,
4887,
436,
50276,
12756,
452,
644,
5322,
604,
253,
28913,
1263,
574,
644,
3732,
281,
625,
685,
816,
581,
2173,
941,
873,
21285,
50276,
1189,
455,
891,
1119,
436,
281,
320,
247,
5322,
2929,
533,
651,
751,
281,
923,
253,
4468,
1840,
670,
1162,
66,
9713,
7152,
339,
793,
360,
3454,
436,
2929,
14177,
281,
1263,
440,
35421,
5028,
15644,
1895,
3066,
36625,
6036,
2228,
253,
4477,
12661,
271,
5170,
3033,
1690,
6036,
2228,
1307,
285,
2216,
5933,
327,
253,
3720,
273,
253,
10527,
1543,
1175,
5661,
1543,
921,
253,
12510,
273,
253,
4081,
1332,
50275,
296,
3755,
20556,
50276,
2520,
2929,
2175,
440,
35421,
5028,
15644,
275,
253,
8668,
273,
6036,
2228,
534,
891,
1158,
310,
581,
273,
253,
954,
1774,
3374,
275,
253,
3676,
9459,
891,
1663,
751,
253,
9400,
273,
253,
2929,
50276,
783,
4081,
1332,
2987,
973,
275,
2067,
49602,
50276,
2520,
2929,
11785,
656,
690,
4722,
11815,
407,
4645,
690,
4722,
6667,
275,
8442,
7613,
436,
2929,
310,
1077,
4722,
285,
3477,
281,
956,
50276,
20881,
1255,
265,
50276,
74,
751,
253,
9400,
273,
253,
2929,
17837,
516,
417,
13762,
407,
253,
10527,
1543,
285,
690,
3916,
403,
417,
973,
4516,
436,
310,
619,
2022,
14755,
3720,
50276,
12563,
891,
1158,
253,
8542,
414,
273,
253,
1332,
3198,
2007,
21999,
253,
2173,
5701,
403,
347,
3637,
50276,
328,
31631,
10527,
1783,
891,
13414,
1158,
16984,
2032,
21473,
3470,
310,
247,
1175,
4327,
323,
36625,
6036,
2228,
359,
476,
923,
326,
29180,
246,
3671,
23899,
288,
310,
2686,
271,
5170,
3033,
273,
6036,
2228,
275,
11370,
374,
17837,
891,
13414,
1158,
352,
2789,
667,
3282,
285,
352,
310,
816,
271,
5170,
3033,
327,
253,
581,
1133,
253,
3033,
310,
417,
40638,
327,
253,
643,
1133,
1293,
13260,
670,
25290,
23899,
253,
3033,
310,
41690,
323,
1650,
359,
476,
873,
269,
18,
285,
269,
19,
3283,
7285,
1543,
285,
840,
29180,
28793,
18,
269,
19,
288,
310,
4067,
685,
278,
323,
667,
288,
275,
436,
1127,
273,
1859,
253,
3762,
2686,
15771,
327,
13260,
670,
253,
25290,
285,
23899,
50276,
3671,
23899,
403,
6221,
275,
9079,
2317,
310,
247,
1077,
2266,
9376,
275,
690,
1524,
15216,
342,
1029,
6967,
3280,
8470,
253,
2032,
21473,
1159,
778,
320,
1077,
2570,
285,
476,
417,
320,
6221,
275,
253,
9079,
2317,
23000,
275,
1340,
281,
5416,
326,
253,
3033,
310,
417,
14916,
4477,
1014,
878,
281,
4656,
25290,
23899,
275,
20077,
273,
9079,
50275,
783,
9376,
326,
23899,
476,
320,
1119,
407,
2603,
2495,
41458,
342,
247,
4577,
2801,
778,
417,
320,
9630,
407,
253,
906,
273,
891,
11454,
6928,
812,
16407,
907,
667,
13301,
594,
13461,
778,
417,
3486,
253,
16407,
1320,
273,
2603,
941,
285,
253,
1750,
17356,
50276,
261,
756,
3702,
285,
908,
347,
17375,
2603,
2228,
310,
1077,
21643,
513,
368,
1599,
247,
1355,
4715,
2281,
323,
2603,
2957,
588,
1421,
281,
247,
1698,
756,
3702,
285,
616,
2954,
310,
5512,
4872,
891,
1158,
436,
1750,
310,
4518,
1275,
4525,
407,
891,
50276,
783,
9376,
326,
23899,
812,
320,
6786,
407,
2603,
2957,
41458,
285,
17927,
5203,
3733,
3198,
10527,
23632,
891,
1158,
253,
1332,
273,
970,
17927,
5203,
310,
2834,
281,
12215,
3045,
672,
253,
767,
10625,
403,
13392,
17927,
5203,
778,
320,
36230,
970,
17927,
5203,
778,
320,
8261,
323,
4942,
2969,
22791,
17837,
891,
5545,
1880,
352,
310,
9371,
323,
16161,
2570,
3237,
3340,
1142,
8542,
3237,
891,
717,
1077,
6110,
275,
253,
3045,
273,
253,
4081,
1332,
327,
5028,
3024,
2886,
534,
310,
247,
2834,
22791,
323,
5028,
15644,
50276,
783,
5661,
1543,
403,
1077,
7996,
281,
4373,
22041,
285,
778,
417,
320,
7763,
275,
1524,
1895,
253,
4081,
1332,
33526,
1682,
1543,
327,
1027,
15302,
342,
1027,
10165,
273,
1162,
66,
285,
17356,
2299,
849,
281,
3609,
247,
1463,
4373,
19484,
310,
417,
973,
4767,
50276,
783,
16038,
273,
253,
2929,
310,
2074,
281,
37685,
534,
2722,
690,
4722,
16774,
1543,
436,
2929,
476,
1918,
253,
1318,
273,
6036,
2228,
45190,
2074,
281,
37685,
50276,
74,
1182,
12109,
1162,
355,
4685,
3676,
4715,
4419,
294,
37341,
26647,
50276,
2886,
42151,
1162,
355,
2774,
11038,
323,
1554,
261,
1505,
5028,
15644,
50276,
12211,
260,
864,
1162,
355,
3700,
1430,
4632,
20741,
1430,
14604,
9879,
29697,
1320,
323,
48960,
5028,
15644,
50275,
783,
4477,
2380,
12453,
247,
1355,
629,
273,
619,
7350,
594,
891,
1818,
619,
4868,
2299,
891,
1335,
5583,
33944,
436,
19529,
4278,
403,
2011,
275,
619,
2380,
50275,
187,
187,
4118,
18435,
27,
2520,
19529,
3400,
247,
747,
3033,
285,
6012,
1332,
323,
440,
35421,
5028,
15644,
1754,
327,
48960,
3733,
50276,
783,
1332,
310,
840,
18171,
6760,
45190,
50276,
856,
50276,
783,
4081,
1332,
3133,
45190,
5547,
50276,
585,
50276,
74,
5194,
342,
581,
273,
253,
30628,
326,
253,
3559,
10527,
22861,
310,
417,
21414,
627,
943,
320,
13260,
327,
849,
25290,
285,
285,
23899,
14588,
275,
1340,
281,
4446,
14282,
23632,
275,
253,
4081,
7792,
253,
3559,
1543,
1060,
1646,
281,
320,
247,
1083,
273,
247,
1332,
326,
369,
17194,
407,
690,
10527,
8180,
285,
4307,
973,
846,
690,
47641,
34754,
497,
1160,
352,
651,
320,
5322,
281,
923,
247,
28452,
1783,
273,
253,
2515,
762,
534,
253,
1332,
588,
320,
5547
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work tries to produce robust decision stump ensembles against feature perturbation to achieve this the authors propose derandomized smoothing drs which can help provide a robust certificate moreover the authors propose a robust mle optimality criterion for individual stumps and two boosting schemes for whole ensembles the experimental results look promising and demonstrate the effectiveness of the proposed method strength 1 the writing and figure presentation are very good 2 the pdf computation via dynamic programming looks novel 3 there are substantial improvements on experimental results and the code is attached as part of the submission weakness 1 one part i would like to see better is i wish the authors could conduct experiments comparing the baseline 23 and the proposed work on more tabular data breastcancer and diabetes seem not sufficient to me in table 2 moreover results for the baseline 23 are directly copied from 23 i would like to see more experimental results on more datasets to assess the effectiveness of the proposed method 2 the other thing im concerned about is that all results reported in table 2 have no standard deviations it is not clear for me how the training and test set split was done or whether there is any overfitting the authors have addressed the limitations in section 7 docsepthe paper focuses on the problem of certifying ensembles of decision stumps efficiently and deterministically by leveraging discretization and piecewiseconstant nature of decision stumps the paper presents a way to compute the output distribution pdf and cdf efficiently in a closed form for standard input randomizations without requiring any sampling this paper also discusses the joint certification over numerical and categorical features and demonstrates how the proposed scheme can be extended to handle categorical features consequently using the deterministic certification scheme the paper proposes a novel way to construct robust decision tree stumps that can then be used to build ensembles through bagging or gradient boosting and adaptive boosting the empirical evaluations demonstrate the improved robust predictive performance of the proposed schemes over existing baselines and the ability of the proposed schemes to provide joint certificates strengths this paper handles the robustness certification of a very widely used class of models ensembles of decision tree stumps and hence has the potential of significant impact given the wide use of such models in practical applications the paper provides a concise but sufficient background on randomized smoothing that allows the reader to understand the challenges that the proposed derandomized smoothing is able to circumvent it is very intuitive how the authors leverage the piecewiseconstant structure in decision stump ensembles and the input randomization independent across dimensions to efficiently compute the output distribution this is a great example of leveraging structure in the problem for efficient and accurate solutions this efficient solution also allows the authors to develop a novel construction of robust decision tree stumps which would not have been possible without such an efficient computation of the output distribution weaknesses one weakness of the proposed scheme is the focus on decision stumps which facilitates the creation of the perfeature metastumps that are critical in the efficient computation it is not clear how this technique would transfer to deeper decision trees since it is not clear to me how the metastumps
could be created for such trees the authors explicitly address limitations and potential negative societal impact one of the limitations being unable to handle arbitrary ensembles of decision trees is related to the weakness i listed above the authors claim that it is possible to extend to decision trees but the extension seems nontrivial however the authors are upfront with this limitation docsepthis paper deterministic smoothing for decision stump ensembles and derive an mleoptimal training method for smoothed decision stumps under randomization strengths 1 good presentation weaknesses 1 the necessity of using randomized smoothing to provide the robustness guarantee for decision stump ensemble 2 the contribution is incremental the contribution of this paper is incremental the paper just applies randomized smoothing to decision stump ensemble and derives the robustness certificate for such a special classifier the key contribution deterministic certificate is because of the overly simple structure of the base classifier decision stump ensemble instead of some new smoothing distributions eg splitting noise levine et al 2021 we can also easily compute deterministic certificates for some other simple classifiers for instance we can derive the deterministic certificates for linear classifiers the reason why we compute the probabilistic certificates for randomized smoothing is the complex structure almost blackbox of the neural network classifier moreover i think the randomized smoothing for decision stump ensemble is only a simplified verison of the general problems studied in cohen et al 2019 yang et al 2020 levine et al 2021 improved deterministic smoothing for l1 certified robustness icml 2021 cohen et al 2019 certified adversarial robustness via randomized smoothing icml 2019 yang et al 2020 randomized smoothing of all shapes and sizes icml 2020
### Summary: | there were substantial discussions around this paper and its contributions the authors did a good job of explaining and interacting with reviewers with as a consequence a substantial raise of scores to prepare the cameraready version it is strongly suggested to include the material introduced at discussion time including the experimental results k1dy and use the intuitive explanation provided for the dynamic programming approach 7kxj to revamp a section paragraph on a highlevel explanation of the approach | [
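The reviews in this row describe the core computation: because stump ensembles are piecewise constant and the input randomization is independent across dimensions, the smoothed model's output distribution can be computed exactly by building per-feature meta-stumps and convolving their small discrete distributions (the dynamic-programming PDF computation praised above). Below is a minimal sketch of that idea under assumptions of my own (an additive one-level stump ensemble and Gaussian per-feature noise); it is an illustration, not the authors' code.

```python
# Illustrative sketch only (not the paper's implementation): exact output
# distribution of an additive decision-stump ensemble under independent
# Gaussian input noise. The additive-ensemble setup, the N(0, sigma^2) noise,
# and all variable names are assumptions made for this example.
from collections import defaultdict
from math import erf, sqrt

def gauss_cdf(z, sigma):
    return 0.5 * (1.0 + erf(z / (sigma * sqrt(2.0))))

# A stump: (feature_index, threshold, value_if_below, value_if_above)
stumps = [(0, 0.3, -1.0, 1.0), (0, 0.8, -0.5, 0.5), (1, 0.1, -1.0, 1.0)]

def output_distribution(x, stumps, sigma):
    # Group stumps per feature: the noisy feature value falls into one interval
    # between sorted thresholds, so the per-feature sum (the "meta-stump") is
    # piecewise constant and its exact distribution is a small discrete pmf.
    per_feature = defaultdict(list)
    for feat, t, lo, hi in stumps:
        per_feature[feat].append((t, lo, hi))

    total = {0.0: 1.0}  # pmf of the running ensemble sum
    for feat, group in per_feature.items():
        group.sort()
        cuts = [float("-inf")] + [t for t, _, _ in group] + [float("inf")]
        pmf = defaultdict(float)
        for k in range(len(cuts) - 1):
            # probability that the perturbed feature lands in (cuts[k], cuts[k+1])
            p = gauss_cdf(cuts[k + 1] - x[feat], sigma) - gauss_cdf(cuts[k] - x[feat], sigma)
            # value contributed by this feature's stumps on that interval
            v = sum(hi if t <= cuts[k] else lo for t, lo, hi in group)
            pmf[round(v, 10)] += p
        # convolve with the distribution accumulated from earlier features
        new_total = defaultdict(float)
        for s, sp in total.items():
            for v, vp in pmf.items():
                new_total[round(s + v, 10)] += sp * vp
        total = dict(new_total)
    return total

dist = output_distribution([0.5, 0.0], stumps, sigma=0.25)
print(sorted(dist.items()))
print("P(ensemble output > 0) =", sum(p for v, p in dist.items() if v > 0))
```

Because the distribution is exact rather than sampled, the resulting prediction and robustness certificate are deterministic, which is the property the reviewers contrast with standard randomized smoothing.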
input_ids: [ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, … ]  (token-id sequence for this row; full listing truncated)
attention_mask: [ 1, 1, 1, 1, … ]  (all ones; full listing truncated)
labels: [ 30003, 310, 1677, 2278, … ]  (token-id sequence for this row; full listing truncated)
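For context on the row structure: the numeric columns are the tokenized form of the text columns, and in the rows shown here attention_mask is all ones while labels appear to mirror input_ids. The sketch below shows one plausible way such a row could be produced; the tokenizer choice and the packing details (for example, whether the prompt portion of labels is masked) are assumptions, not facts about this dataset.

```python
# Hedged sketch of how a row's numeric columns could relate to its text columns.
# The tokenizer named below is a placeholder choice, not necessarily the one
# actually used to build this dataset.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # placeholder tokenizer

input_text = "Below is given review of a research paper ... ### Summary:"
output_text = "there were substantial discussions around this paper ..."

enc = tok(input_text + " " + output_text, truncation=True, max_length=2048)
row = {
    "Input": input_text,
    "Output": output_text,
    "input_ids": enc["input_ids"],
    "attention_mask": enc["attention_mask"],  # all ones when nothing is padded
    "labels": list(enc["input_ids"]),         # causal-LM style: labels mirror input_ids
}
print({k: (v if isinstance(v, str) else len(v)) for k, v in row.items()})
```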
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
in this paper the authors consider solving three ml security related challenges that would primarily arise in the cloud based ml model namely they consider the setting where a client wishes to obtain predictions from an ml model hosted on a server while being sure that the server is running the model they believe is being run and without the server learning nothing about their input additionally the server wishes for the user to learn nothing about the model other than its output on the users input to solve this problem the authors introduce a new scheme for running ml algorithms in a trusted execution environment the key idea is to oursource expensive computation involved with forwarding images through a model to an untrusted gpu in a way that still allows for the tee to verify the integrity of the gpus output because the authors method is able to utilize gpu computing they achieve substantial speedups compared to methods that run the full neural network in trusted hardware overall i found the paper to be very well written and easy to digest and the basic idea to be simple the authors strike a nice balance between details left to the appendix and the high level overview explained in the paper at the same time the authors proposed solution seems to achieve reasonably practicable performance and provides a simple highthroughput solution to some interesting ml security problems that seems readily applicable in the mlasacloudservice use case i only have a few comments and feedback i would recommend the authors use the full 10 pages available by moving key results from the appendix to the main text at present much of the experimental evaluation performed is done in the appendix eg figures 3 through 5 the notation prs oversetsgetsmathbbsn is not defined anywhere as far as i can tell before its first usage in lemma 21 does this just denote the probability over a uniform random draw of s from mathbbs if so i might recommend just dropping the subscript a b and c being deterministic makes the sample space unambiguous negllambda is also undefined in section three you claim that slalom could be extended to other architectures like residual networks can you give some intuition on how straightforward it would be to implement operations like concatenation required for densenets i would expect these operations could be implemented in the tee rather than on the coprocessor and then verified however the basic picture on the left of figure 1 may then change as the output of each layer may need to be verified before concatenation i think augmenting the right of figure 1 to account for these operations may be straightforward it would be interesting to see throughput results on these networks particularly because they are known to substantially outperform vgg in terms of classification performancedocsepthe authors propose a new method of securely evaluating neural networks the approach builds upon existing trusted execution environments tee a combination of hardware and software that isolates sensitive computations from the untrusted software stack the downside of tee is that it is expensive and slow to run this paper proposes outsourcing the linear evaluation portions of the dnn to an untrusted stack thats colocated with the tee to achieve privacy ie the input isnt revealed to the untrusted evaluator the approach adds a random number r to the input vector x evaluates fxr on the untrusted stack then subtracts off fr from the output this limits the approach to be applicable to only linear functions to achieve 
integrity verify the correctness of the output the paper proposes testing with random input vectors an application of freivalds theorem which bounds the error probability the techniques for integrity and privacy works only on integer evaluations hence the network weights and inputs need to be quantized the paper tries to minimize degradation in accuracy by quantizing as finely as numerically allowable achieving 05 drop in accuracy on two example dnns overall compared to full evaluation in a tee this approach is 10x faster on one dnn and 40x to 64x faster on another network depending on how the network is formulated disclaimer i am a complete outsider to the field of hw security and privacy the paper is very readable so i think i understand its overall gist i found the approach to be novel and the results convincing though i may be missing important context since im not familiar with the subject to me the biggest missing piece is a discussion of the limitations of the approach how big of a network can be evaluated this way is it sufficient for most common applications what are the bottlenecks to scaling this approach its also not clear why integrity checks are required is there a chance that the outsourcing could result in incorrect values its not obvious why it would lastly a question about quantization you try to quantize as finely as possible to minimize quantization errors by multiplying by the largest power of 2 possible without causing overflow since quantization need to be applied to both input and network weights does this mean that you must also bound the scale of the input or do you assume that the inputs are preprocessed to be within a known scale is this possible for intermediate outputs ie after the input has been multiplied through a few layers of the dnn pros simple yet effective approach to achieve the goals laid out in the problem statement clearly written thorough experiments and benchmarks strong results cons no discussion of limitations minor questions regarding quantization and size limits disclaimer reviewer is generally knowledgeable but not familiar with the subject areadocsep given the growing interest in building trust worthy and privacy protecting ai systems this paper demonstrates a novel approach to achieve these important goals by allowing a trusted but slow computation engine to leverage a fast but untrusted computation engine for the sake of protecting privacy this is done by establishing an additive secret share such that evaluation on one part of the share is performed offline and the computation on the other part of the share is performed on the untrusted engine to verify the correctness of the computation on the untrusted server a randomized algorithm is used to sample the correctness of the results using these techniques the authors demonstrate an order of magnitude speedup compared to running only on the trusted engine and 34 orders of magnitude speedup compared to softwarebased solutions overall this is a strong paper which presents good ideas that have influence in ml and beyond i appreciate the fact that the authors are planning to make their code publicly available which makes it more reproducible below are a few commentsquestionssuggestions 1 this papers and other papers too propose mechanisms to protect the privacy of the data while outsourcing the computation on a prediction task however an alternative approach would be to bring the computation to the data which means performing the prediction on the client side in what sense is it better to 
outsource the computation note that outsourcing the computation requires both complexity on the server side and additional computation on the client side encryption decryption 2 you present the limitations of the trust model of sgx only in the appendix while in the paper you compare to other techniques such as gazzelle which have a different trust model and assumption it makes sense to at least hint the reader on these differences 3 in section 22 has to be processed with high throughput when available is it high throughput that is required or low latency 4 in section 43 in one of the vgg experiment you computed only the convolution layers which as you say are commonly used to generate features in this case however doesnt it make more sense that the feature generation will take place on the client side while only the upper layers dense layers will be outsourced 5 in section 43 private inference do you include in the time reported also the offline preprocessing time as far as i understand this should take the same amount of time as computing on the tee
### Summary: | the authors propose a new method of securely evaluating neural networks the reviewers were unanimous in their vote to accept the paper is very well written the idea is relatively simple and so it is likely that this would make a nice presentation | [
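The reviews in this row describe two mechanisms for outsourcing a linear layer y = Wx from the trusted environment: additive blinding for privacy (send x + r to the untrusted accelerator and subtract the precomputed Wr from the reply) and a Freivalds-style random check for integrity. The numpy sketch below illustrates both on a single matrix-vector product; it uses floating point for simplicity where the reviewed scheme quantizes to integers, and it is an illustration, not the paper's implementation.

```python
# Illustrative sketch (not the reviewed system's code) of additive blinding and
# a Freivalds-style integrity check for y = W @ x. Names W, x, r, s and the
# float arithmetic are simplifications chosen for this example.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 512))   # layer weights, known to both sides
x = rng.standard_normal(512)          # private input held by the trusted side

# offline phase, inside the trusted environment
r = rng.standard_normal(512)          # one-time blinding vector
Wr = W @ r                            # precomputed unblinding term
s = rng.standard_normal(256)          # secret vector for the integrity check
sW = s @ W                            # precomputed s^T W

# online phase: the untrusted accelerator only ever sees x + r
blinded_reply = W @ (x + r)

# back inside the trusted environment
y = blinded_reply - Wr                # unblind: y equals W @ x up to float error
assert np.allclose(y, W @ x)

# Freivalds-style check: one cheap inner product instead of recomputing W @ x
assert np.isclose(s @ y, sW @ x)

# a tampered reply is rejected with high probability
bad = y.copy()
bad[7] += 1.0
print("tampered reply passes the check:", np.isclose(s @ bad, sW @ x))
```

The blinding step only cancels for linear operations, which is why the reviews note that the approach is restricted to the linear portions of the network.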
input_ids: [ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, … ]  (token-id sequence for this row; full listing truncated)
attention_mask: [ 1, 1, 1, 1, … ]  (all ones; full listing truncated)
labels: [ 30003, 310, 1677, 2278, … ]  (token-id sequence for this row; full listing truncated)
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
primary strength the primary strength of the paper is the creation of a single api to access chest xray datasets along with relevant baselines pretrained models and standard data tools that make the researchdevelopment pipeline on chest xray imaging convenient standardized and easily reproducible significance of work preparing data classes to adapt to heterogeneous structures formats labels and sourcespecific fields are a major component of working with chest xray datasets the difficulty is compounded when working with multiple datasets as is the case in applications such as domain adaptation continual learning and transfer learning a large part of this effort is mitigated by torchxrayvision standardization of data handling preprocessing corresponding standard baselines and pretrained models provided in this library will aid reproducibility another argument in support of this papers strength is the benefit the community has already seen from this library sections 12 13 papers across transfer learning 1 patient clinical trajectory 2 4 fewshot transfer 3 out of distribution generalization 5 continual learning 6 etc have already leveraged this library to my knowledge there is no equivalent library for chest xray imaging and the benefits to reproducibility standardization convenience of implementation and expediting research are significant structurelanguage of paper the paper itself is easy to follow and summarizes the primary functionalities of the library well example applications with sample code are provided in the paper as reference to each functionality and a few example notebooks are provided within the library the library is integrated well with standard pytorch pipelines and follows conventional pythonic styles references 1 douglas p s gomes et al mavidh score a covid19 severity scoring using chest xray pathology features 2 douglas p s gomes et al potential features of icu admission in xray images of covid19 patients 3 mehdi cherti and jenia jitsev effect of largescale pretraining on full and fewshot transfer learning for natural and medical images arxiv210600116 5 2021 4 aniket maurya predicting intubation support requirement of patients using chest xray with deep representation learning 5 joseph paul cohen mohammad hashir rupert brooks and hadrien bertrand on the limits of crossdomain generalization in automated xray prediction medical imaging with deep learning 2020 6 srivastava s yaqub m nandakumar k ge z mahapatra d 2021 continual domain incremental learning for chest xray classification in lowresource clinical settings fair workshop miccai 2021 novelty the novelty of this work as expected is arguable the library provides no novel functionality the implementations of the data classes tools and pretrained models are largely a development effort modeled on torchvision 1 and in most cases a reimplementation of standard protocols for chest xray datasets 23 among other standard dataset structuring preprocessing reading protocols albeit within a single structure and api documentation the documentation within the library itself is lacking to start to use the library seems to require a detailed read of the dataset module httpsgithubcommlmedtorchxrayvisionblobmastertorchxrayvisiondatasetspy to my knowledge the structure the data is expected in is confirmed only within the data classes this seems unnecessary however a link to the class documentation or a more detailed description within the repository readme may be helpful the core dataset module might benefit from documentation on the 
merge filter subset classes the benchmarks and baselines implemented in the library could benefit from more detailed documentation the benchmarksmd for example lacks training details graphslogs employed metrics multiseed results details that will be needed in using the provided pretrained models as reportable benchmarks references 1 httpspytorchorgvisionstableindexhtml 2 irvin jeremy et al chexpert a large chest radiograph dataset with uncertainty labels and expert comparison proceedings of the aaai conference on artificial intelligence vol 33 no 01 2019 3 bustos aurelia et al padchest a large chest xray image dataset with multilabel annotated reports medical image analysis 66 2020 101797 docsepthe authors make it easier for others to work with more than one chest xray dataset this facilitates evaluating models on data from multiple sources and developing more robust models that were trained on more than one dataset the pretrained models may be useful as baseline models to compare against i want to be very open about this i am really unsure how to grade this paper and whether it is interesting to the midl audience i have experienced myself how hard it is to get papers accepted that just describe an implementation and i was always of the opinion that this should be easier the authors definitely provide a value to the community through their library which i want to acknowledge but i must admit that i was not fascinated reading the paper i think i would have liked it better if it contained less how to information which belongs into the software documentation and a little more focus on the motivation and concepts which much of the paper already does well a weakness of the library might be that some parts of it seem to be geared towards pytorch although the core dataset handling and abstraction mechanisms could also have been useful to tensorflow users along similar lines i would suggest to have a lowlevel api that provides just filenames to be loaded so that the library could also be integrated into frameworks where native image loaders eg itk ones should be used and numpy arrays are not desired
### Summary: | the paper is a good contribution to the medical imaging community since it describes in detail about the use of the torchxrayvision library although not strictly a paper contributing novel methods this is definitely a good piece of work in a very applied field such as medical image analysis with the exception of one reviewer all others rate the paper very highly and i feel the paper should be accepted and the audience given an opportunity to know about it | [
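The reviews in this row concern the torchxrayvision library itself: shared dataset wrappers for heterogeneous chest X-ray sources, merge/filter/subset utilities, and pretrained baseline models. A minimal usage sketch follows; the class names, the weights tag, and the sample dictionary keys are recalled from the library's documentation and should be treated as assumptions to verify against the current release, and the image paths are placeholders.

```python
# Hedged usage sketch of the reviewed library. API details below (NIH_Dataset,
# PC_Dataset, relabel_dataset, Merge_Dataset, the "densenet121-res224-all"
# weights tag, and the "img"/"lab" sample keys) are assumptions based on the
# project's README and may differ in the current release.
import torch
import torchvision
import torchxrayvision as xrv

transform = torchvision.transforms.Compose([
    xrv.datasets.XRayCenterCrop(),
    xrv.datasets.XRayResizer(224),
])

# Two source datasets exposed through the same interface.
nih = xrv.datasets.NIH_Dataset(imgpath="path/to/NIH/images", transform=transform)
pc = xrv.datasets.PC_Dataset(imgpath="path/to/PadChest/images", transform=transform)

# Align both label spaces to the library's shared pathology list, then merge --
# the multi-dataset workflow the reviews highlight.
xrv.datasets.relabel_dataset(xrv.datasets.default_pathologies, nih)
xrv.datasets.relabel_dataset(xrv.datasets.default_pathologies, pc)
merged = xrv.datasets.Merge_Dataset([nih, pc])

# Pretrained baseline; outputs are aligned with model.pathologies.
model = xrv.models.DenseNet(weights="densenet121-res224-all")

sample = merged[0]                                   # dict with "img" and "lab"
img = torch.as_tensor(sample["img"]).float()[None]   # add a batch dimension
with torch.no_grad():
    out = model(img)
print(dict(zip(model.pathologies, out[0].numpy())))
```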
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in binary image classification setting the paper shows that when using an attack set augmenting the training set with additional intentionally mislabeled datapoints to vary the generalization performance of a fixed neural network nn architecture the approximation of bayesian prior pf of the outputs of the trained nn on the test set correlates very well with generalization the effect persists when using different optimizers and different number of training epochs in contrast the popular measure of flatness approximation of the spectral norm of the hessian of the training loss correlates worse with generalization in these settings when using sgd and does not correlate with it at all when using variants like adam or entropysgd the authors provide some preliminary explanation of the effect and propose the bayesian prior as a more reliable generalization proxy strengths 1 in the specific settings considered the correlation between generalization and the proposed bayesian prior is extremely strong and robustly outperforms the flatness measure 1 appendix contains many subsettings considering different measures of flatness and different training durations all corroborating the findings in the main text 1 disentangling the influence of architecture and optimization on generalization is an important area of research weaknesses the main weakness of the paper is that all measurements are done under a somewhat exotic setup that imo does not resemble a realistic situation where one would do model selection based on some generalization measure for this reason i find strong claims about the bayesian prior being a more robust predictor than flatness eg in the abstract and in the conclusion unsubstantiated precisely 1 the authors use attack sets to generate models with 0 training loss of varying generalization i suspect this might produce a very specific distribution of resulting functions a much more natural and realistic setup would be to just sample many different hyperparameters like learning rate batch size learning rate decay momentum number of epochs weight decay etc by pushing them to extremes i am pretty sure it is possible to obtain 100train accurate yet poorlygeneralizing models and even if they all happen to generalize well then that resulting smaller range of performance would be the realistic scale at which practitioners and researchers would want to see a correlation between generalization and the proposed proxy another more realistic parameter to vary would be just the training set size since training onobtaining more data could be costly and one might want to estimate generalization scaling as a function of training set size but an attack set setting is something that i find much less practically relevant and worry that if we were to consider a practical setting as above the correlation might not be there since the manifold of models trained with varying attack sets and the more practicallyrelevant manifold of models trained with varying hyperparameters are likely very different please correct me if im wrong or missing something 1 iiuc the bayesian prior should only be compared between the same models due to the assumption that ps is constant which is otherwise intractable therefore pf cannot be used for model selection between different architectures or priors on the weights only the training set and the training procedure can vary in contrast flatness can be compared between any nns with the same loss function which is another reason why i think the claim about prior being a superior 
measure is too strong 1 finally i think its realistic to expect that there are manifolds of models where prior does not correlate with generalization and it would be nice if the authors proactively considered discussed such settings to set more realistic expectations for the reader for example i suspect that in a regression mse loss setting you could obtain a model that fits the training set perfectly but then falls off to 0 rapidly everywhere else and doesnt generalize at all yet this model will have the highest prior possible due to the prior being a zerocentered gaussian in a classification setting perhaps constructing an attack set differently could also lead to different behavior eg suppose i only misclassify one class incorrectly ie return a constant class label for the whole attack set would the prior still correlate with generalization then its fair to leave this sort of investigation for future work but without such i again find the central claim of the paper too strong questions 1 is the sharpness measured on the training set or the training and attack set jointly and do you think it matters have you or any prior work considered measuring sharpness on a separate validation set and do you think it would make sharpness perform better 2 have you considered similar experiments in an mse regression setting with continuousvalued outputs in this setting it would be simpler and cheaper to measure pf in closedform with nngp and im curious how mse and accuracy would correlate with the prior flatness in this case minor 1 abstract zero error on a test set training set 1 section 21 the loss is defined on the space of outputs mathcaly but is usually instead noted as lbfw this is inconsistent with definition but also makes it hard to understand if its the train train attack validation set loss 1 page 5 line 2 psgdfs 1 figure 2 caption the function with the largest prior 1 figure 2 regarding two bands could you add different colors for them 1 typos in section a1 eg wwe type b postrebuttal update the authors have answered my questions and clarified that we may have some differences in interpreting their results but i dont think its significant enough for me to change the original score original review the paper presents an interesting setting in which the approximation to the bayesian prior pf correlates remarkably well with generalization while flatness does not as i detail in my review i find the setting to be somewhat constrained and unrealistic and therefore the claim that the prior is a significantly more robust predictor of generalization than flatness is too strong especially since there is no solid theoretical or qualitative explanation for why exactly pf works nonetheless i still find this to be a novel and meaningful result and look forward to future research investigating explaining this effect further so am inclined to accept assuming the authors soften their claims or convince me that my concerns are overblown docsepthis paper studies the correlation between flatness and generalization in deep neural networks and show that consists with some previous studies the correlation could sometimes be broken as an alternative it propose a new measure based on the bayesian prior upon initialization and empirically demonstrate this criterion could maintain a positive correlation with generalization even in the case where flatness breaks strength this paper is relatively well written and easy to follow it studies an interesting and important question that characterizes the generalization 
performance of a trained neural network it proposes a new criterion and demonstrates the effectiveness in standard datasets weakness i think the paper is still a bit weak in a number of aspects 1 optimizers plays an important role in deciding the final form of a trained model but the architecture and parameterization are also fundamentally important it would be great if in addition to the optimizer comparison experiments the paper could add experiments to compare across different architectures there is also a belief that overparamerization contribute to better flatness and generalization so it would be good to see how the number of model parameters correlate with flatness and the bayesian prior as well 2 there is a new optimizer called sharpnessaware minimization httpsarxivorgabs201001412 that minimize the loss sharpness during optimization and are shown to greatly improve the generalization performance in some tasks it would be interesting to add sam to the existing optimizer comparison experiments 3 although not strictly required developing a theoretical link between the proposed criterion and generalization would make the paper more coherent and solid 4 gps are used to approximate the bayesian posteriors in this paper how accurate is this approximation could the paper provide a synthetic study like in 51 where the groundtruth can be measured by direct sampling and compare the accuracy of gp based estimation 5 in the correlation figures eg fig 3 the generalization is shown as the y axis i wonder are all the networks trained to 0 training error if not what would be the plot like when test accuracy is plotted in the y axis after rebuttal thanks for the clarifications i intend to keep my rating this paper propose a new criterion to measure generalization and empirically show that it correlate with generalization even in cases where flatness fails to maintain the correlation it would be more solid if direct theoretical link could be shown it is ok for the paper to be primarily empirical studies but in this case a more systematic set of experiments could help strengthen the paper while it is impossible to enumerate all possible comparison cases a number of canonical settings like different parameterizations see above for details are very important docsepthe paper proposes using the log of the prior of the function as a predictor of generalization they provide arguments for why this is better than using sharpness of the local minima and support their arguments with empirical results on small datasets there are several recent works which looked into the connection between generalization and the sharpness of the local minimum here sharpness is often defined using the spectrum of the hessian in addition to some normalization terms in some cases however previous measures of sharpness are not invariant under parameter rescaling which would produce the same solution but with a different level of sharpness the authors in this work propose an alternative measure the log of the prior of the function as a predictor of generalization the motivation behind this comes from recent lines of work which look into the relation between generalization and the posterior probability pf s the authors note that the posterior is related to the prior pf by a constant term since neural networks are expressive enough to have zero training error hence pf might serve as a good predictor of generalization here the prior on the 01 function f is the total probability mass of the weights that produce that function over a fixed 
dataset training test where the probability measure over the weights is computed using gp the authors demonstrate experimentally on mnist and cifar10 that there is indeed a strong correlation between pf and generalization in addition because this is defined over 01 functions it is invariant to parameter rescaling in general the paper is wellwritten and provide a nice historical review however it is not clear where the noveltycontribution is as mentioned by the authors previous works have already shown that pfs is a good predictor of generalization the only addition in this work is replacing pfs with pf but because pf is a constant multiple of pfs claiming that pf correlates well with generalization is the same as claiming that pfs correlates with generalization which is what was previously established in addition i do have concerns with the way pf is computed in this paper the authors use sgd as a way of sampling function f if there exists good functions that are not often found by sgd the space of functions studied by the paper would be a distorted version of the reality for example some works on reinitialization eg httpsarxivorgabs210900267 suggest that there exists flatter local minima that are only reached by sgd if a subset of the parameters are reinitialized frequently in other words a good local minimum would be surrounded by sharper local minima that generalize less but because the flatter local minima are surrounded by the sharper ones sgd would settle on the sharper local minima this is fixed using reinitialization if one uses sgd to sample the functions f many functions could be missing other comments questions how is sharpness in definition 21 computed in the experiments minor issues in definition 21 it should be mentioned that w is a stationary point otherwise the connection to the hessian is wrong in definition 22 the actual definition is on a separate paragraph the definition of ps can be misleading it is a function of s but not probability of the sample s i suggest using a different notation eg gs instead of ps the statement in page 4 related to figure 1 is not really justified one can draw a picture in which that statement does not hold ie that the volume of f is not correlated with pf it is just an intuition but it can be wrong and it is not clear to me why it is more likely to be true within the main text the authors use a more acceptable argument about pf having different orders of magnitude but thats different from the claim in figure 1 please move figure 2 to the top of the page i am curious to know how sgd with momentum and the recent sharpnessaware minimization sam would perform in figure 4 have the authors experimented with them typos page 6 the function the largest prior page 9 neededmorever needs a full stop the paper is well written however the main contribution in the paper seems to be identical to previous works i would appreciate it if the authors elaborate on what their contribution is compared to the works of valleprez louis 2020 and mingard 2021 docsepthis paper studies the relation between generalization and the flatness of the loss landscapes the authors show that the correlation between local flatness and generalization is broken in the case of using adam or entropysgd or by performing parameter rescaling they show that the bayesian prior of dnns which are trained until zero training error could on the other hand predict generalization better than the local flatness measures this is motivated by the work of mingard et al 2021 where it is shown that the 
bayesian posterior function correlates well with the probability function of dnns that reach zero training error using sgd strengths the empirical observation that adam and entropysgd could break the correlation between flatness and generalization is interesting the empirical observation that the bayesian prior is linked with generalization reinforces the ideas presented in the paper mingard et al 2021 this is an interesting area to explore especially to connect empirical observations with theoretical findings throughout the paper there are some nice explanationsintuitions given for each figure presented in the main paper there are similar experiments to provide sanity checks weaknesses the novelty of the paper is very limited previous work already suggests that care must be taken when interpreting local measures of flatness in 1 it has been shown that local flatness alone is not enough and we need global wellconnectivity as well the results regarding how rescaling would break the correlation between generalization and flatness are also studied before in the literature dinh et al 2017 the link between the bayesian prior and generalization is studied in mingard et al 2021 more precisely in mingard et al 2021 the authors show that pb f s generalization error as for a given s pb f s p f then it is implicit that p f generalization error which is the main result of this submission the paper needs to be fully proofread there are many typos throughout the paper missing a the minst instead of mnist the first sentence of section 53 is incomplete etc the flow of the paper also needs major adjustments the paragraphs are not coherently written also a lot of the sentences could be summarizedrewritten some example suggestions definition 22 is not needed in contribution number 2 explanations should be avoided instead only the main results should be stated from the sentence for discrete the text should be moved to the proper section in some parts of the text some unusual terms are used examples section 21 the inputs live in the observed output y the parameter vector w is not defined in section 21 and in section 22 the loss function is a function of w which is a mismatch with the notation used in section 21 to define the loss function section 4 could be combined with section 3 there are repetitions in section 3 and some paragraphs do not follow well from one another as there are no theoretical results section 2 could be significantly shortened all these would help with the flow of the ideas and would help the reader to stay focused on the main outcomes of the paper in the current shape the main outcomes are in the shadows the results of the paper only come after page 6 as the paper is an empirical study it is lacking important ablation studies the scope of the empirical results should be expanded for example be similar to the scope of empirical results in 1 one example case study is to compare different initial random seeds are the networks with higher volume higher bayesian prior also the networks with higher test accuracy it would be better to move the last paragraph of the related work which is currently in the appendix to the main paper and discuss exactly the new contributions in light of the three mentioned prior work it is implicit that if the bayesian posterior is related to generalization which is studied thoroughly in mingard et al 2021 then the bayesian prior is also related to generalization stating this result is not a new claim although i believe that empirically showing this is interesting 
for this paper to be an empirical study many more ablation studies and settings should be studied to make the claim practically convincing minor comments there is a rather new study on neural network loss landscapes 1 where the authors show that the best accuracy is achieved when not only the model has converged to a locally flat region but also to a globally wellconnected region it would be interesting to compare the bayesian prior with the combination of global connectivity and local flatness what is the difference between global connectivity and bayesian prior is there a relation between the two can we say that globally wellconnected regions also have a large volume similar to definitions 21 and 22 which are properly cited definition 23 should also be cited from mingard et al 2021 figure 1 is a replica of figure 6a of mingard et al 2021 proper credit should be given throughout the paper either only refer to flatness or only refer to sharpness the inconsistency makes the resultstext less readable the results in figure 2 are not the main focus of the paper instead the main focus is to find scenarios where the correlation between the two and therefore the correlation between flatness and generalization breaks so there is no need to present figure 2 in the main paper 1 yang et al taxonomizing local versus global structure in neural network loss landscapes 2021 overall although the two observations that the bayesian prior is empirically linked with generalization and that when using adam or entropysgd flatness is no longer linked to generalization are interesting the contribution of the paper is not enough the novelty of the work is very limited and the empirical results are insufficient docsepthis paper shows that for certain variants of sgd the commonly held assertion that flat minima imply good generalization may not hold the overall conclusion from this aspect of the paper is that popular measures of flatness sometimes do and sometimes do not correlate with generalization the authors continue on to propose an alternative measure the bayesian prior which they claim correlates well with generalization in essence this measure captures the volume of parameter space which when a networks weights are initialized within the volumes region it converge to a prespecified function with low training error after covering the definitions and related measures they perform experiments on mnist cifar10 and discuss how the choice of optimizer affects the claims first its extremely easy to determine the authors they selfcite collectively over 50 times if you include the publications where the authors are coauthored furthermore they have the current and the former neurips version of this paper posted on arxiv where the original title was slightly different and with a less ambiguous conclusion why flatness correlates with generalization for deep neural networks this isnt against iclrs rules and this review is not to judge either arxiv version but since v1 is publicly available i have considered it among the relevant past work for the currently submitted paper general concerns no specific order the name neutral space of f has a much more commonly known name among the mathematicsneural network community its called the fiber of f see httpsenwikipediaorgwikifibermathematics section 24 it is stated that dnns are typically trained to zero training error on the training set as a practitioner myself im not sure whowhere this information is coming from dnns are typically not trained to zero training error as that 
almost always results in overfitting and an overusage of timecomputational resources figure 1 it shows the loss function has negative error at certain weights w this seems to contradict what was said about loss functions in the paragraph immediately prior that the loss function is zero or positive what does negative error mean i find it slightly disingenuous to place central definitions in the appendices to achieve the page limit gamifying the page limit isnt the purpose of the page limit expecting reviewers to read 35 pages in the time allotted isnt considerate appendix b where the parameterfunction map is defined it states that it was first introduced in valleprez et al 2018 the map from the parameter space which takes a parameter and maps it to a function is so ubiquitous its questionable to assign any single person as its creator my cynical side questions if this was an attempt at selfcitation minor the text makes the jump from lyy to lw without stating that y is parameterized by w its not a big deal but the less familiar reader may get confused at an early stage with this jump section 3 states perhaps the simplest argument is that if dnns trained to zero error are known to generalize well on unseen data thats a very big if in practice dnns trained to zero error are expected to not generalize well on unseen data section 3 states mingard et al 2021 showed that log pbfs correlates quite tightly with generalization error if this is the case then its not clear to me if the second aspect of this work is a novel contribution if it is its not clear to me how section 51 states that the dnn being used was found to be sufficiently expressive to represent almost all possible functions the relu network is shaped 7 40 40 2 and its claimed that this is capable of representing almost all of the possible 34 x 1034 functions id like proof for example consider proposition 3 from the well known work on relu linear regions httpsarxivorgabs14021869 where even the most conservative upper bound for the number of functions computable by a relu network with n hidden neurons is 2n 280 o1024 this is 10 orders of magnitude short for this case the lower bound on the maximum theorem 4 is even more pessimistic and both of these bounds have been shown to be overoptimistic in later works eg httpsarxivorgabs171102114 taking advantage of the fact that zeroactivations of relus collapse the dimension of the signal passing through the network ten orders of magnitude is hard to imagine section 51 the authors take 108 initial points that is 26 orders of magnitude short of how many functions are computable or 0000000000000000000000001 of the possible functions wouldve been sampled it is likely that even less were sampled since these points were chosen from essentially a bounded sphere in parameter space ie finite samples from a thintailed gaussian distribution its difficult to back up any empirical claim when theres 1034 functions critical section 52 it is stated that in order to generate functions f with zero error on the training set s but with diverse generalization performance we use the attackset trick from wu et al 2017 this seems like a critical flaw in that the training process is not sgd and therefore the distribution of parameters could be substantially different than that of using sgd this feels like the evidence only supports the different claim that for this attack dataset trick these claims about flatness volume measures hold section 53 states changing optimizers or changing hyperparameters can of course alter 
the generalization performance by small amounts which may be critically important in practical applications true and false if small was replaced by large id say true to be facetious one could choose hyperparameters extreme enough where generalization is arbitrarily poor since hyperparameters can alter the level of generalization from arbitrarily poor to stateoftheart for some models this is an important point perhaps its present somewhere in the 35 pages and i do not see it but im unable to find much comparison for this at all figure 5 what is alpha scaling not introduced as far as i can find in the discussion it states evidential basis for this hypothesis has always been mixed it seems like the first main contribution of this work adds to the pile of mixed results another excerpt here we performed extensive empirical work showing that flatness can indeed correlate with generalization however this correlation is not always tight and can be easily broken by changing the optimizer or by parameterrescaling what i take away from this is that sometimes its correlated sometimes its not and theres no clear pattern at this point seeing as this is the main title of this paper it doesnt feel like a novel result at all it seems like more supporting evidence that we really dont know how flatness measures correlate with generalization i could be wrong clarification would be helpful typsetting many figures are hard to read with ticks have 6 pt font or otherwise appear to have errors in typesetting graphs sitting overtop of words or maybe labels were manipulated on ms paint or the alike overall theres very little consistency across figures figure size varies substantially legends are seemingly randomly placed sometimes larger than the labels sometimes not figures are different sizes even within the same 8subfigure figure sota state of the art or also written stateoftheart in this same paper doesnt require an abbreviation since its used once well really its introduced twice along with its abbreviation twice regardless not needed same goes for cross entropy ce introduced without ever using it again gaussian processes gps is used multiple times but also introduced multiple times and its not introduced at the first opportunity either seems unorganized the use of wrt just write it out instead of bringing in needless abbreviations quotation style fluctuates often look closely at the frequently changing quotation marks fig vs fig appendix vs appendix eq4 vs eq 4 figs vs fig for multiple subplots consistency and generally speaking upper case fig and eq tend to look more professional general equation typesetting spacing periods before a sentence with an equation ends commas spaces just simple typos that could be caught by any spell check fig 2 vs fig s3 i dont get the ss for example in section 51 it states the figure is in the appendix but its linked to regular fig 3 sometimes figures with s link to the main text sometimes not check this the function with the largest prior missing the word with overall its very long there are a few critical concerns about the soundness of the claims there are some typesetting issues the figures could use work a wall of dozens of hardtoread graphs at the end arent helping the focus the conclusion for me is quite ambiguous im not sure what to take away from this paper in terms of if flatness makes sense to use as a predictor of generalization the title doesnt entirely reflect whats being claimed in the paper since much of the paper focuses on the bayesian prior
### Summary: | this paper proposes the use of a bayesian prior upon initialization for predicting the generalization performance of a neural network and empirically shows that it can outperform flatnessbased measures understanding the underlying reasons that control generalization performance of neural networks is of great theoretical and practical importance and reviewers find efforts in this direction valuable however they believe the submission in its current state is not ready for publication specifically zcfx believes the setup considered in the paper does not resemble a realistic situation which makes claims about the bayesian prior being a more robust predictor than flatness unsubstantiated zcfx appreciates the authors response and clarifications but finds the concerns unresolved ohft believes the paper is weak in certain aspects such as comparing across different architectures including the number of parameters and comparing with the sam optimizer whose goal is to find flat minima and which has been shown to greatly improve generalization performance ohft acknowledged reading the authors response but the response did not help with changing the score of the paper r1hf has some reservations about the novelty of the work and the limited experiments which remained unresolved r1hf suggests that the authors revise the paper to emphasize the authors contribution in light of the previous work based on the reviewers feedback i suggest the authors resubmit after revising the draft to address the issues raised above
374,
275,
253,
2022,
2929,
50275,
18,
30966,
1162,
355,
2891,
11168,
3006,
1980,
7147,
4156,
2605,
275,
11454,
2990,
2957,
37328,
43425,
50272,
1189,
455,
3738,
253,
767,
7313,
326,
253,
17699,
16561,
2720,
310,
45190,
7939,
342,
26647,
285,
326,
672,
970,
38622,
390,
994,
1658,
656,
35333,
6507,
1255,
310,
642,
3356,
7939,
281,
26647,
403,
4722,
253,
7680,
273,
253,
2929,
310,
417,
2217,
253,
38135,
273,
253,
789,
310,
1077,
3710,
285,
253,
16774,
1543,
403,
12497,
50276,
7152,
33032,
2520,
2929,
2722,
326,
323,
2176,
11640,
273,
256,
35333,
253,
7744,
2918,
17077,
326,
6507,
46836,
16084,
1175,
26647,
778,
417,
2186,
253,
4583,
6452,
432,
436,
4809,
273,
253,
2929,
310,
326,
4633,
5593,
273,
6507,
1255,
4536,
513,
285,
4536,
513,
417,
24888,
342,
26647,
253,
4477,
4035,
327,
281,
12661,
271,
5795,
2557,
253,
17699,
16561,
2720,
534,
597,
1750,
27972,
973,
342,
26647,
275,
17718,
436,
2557,
28174,
253,
4644,
273,
4764,
2317,
534,
672,
247,
6928,
13461,
403,
31260,
1561,
253,
14118,
2919,
352,
29623,
281,
247,
838,
1553,
1245,
1159,
342,
1698,
3733,
2228,
846,
10985,
253,
14308,
285,
2905,
5593,
597,
1347,
4679,
327,
278,
79,
382,
260,
338,
274,
740,
285,
2319,
849,
253,
4327,
273,
5556,
6081,
11852,
253,
3916,
806,
697,
6685,
3477,
281,
3653,
253,
4477,
597,
1881,
41766,
26708,
689,
2456,
2069,
604,
368,
2486,
253,
16516,
835,
253,
4477,
403,
820,
14399,
2149,
33810,
597,
452,
253,
1655,
285,
253,
3438,
5723,
2824,
2715,
273,
436,
2929,
9269,
327,
549,
32693,
835,
253,
3236,
4060,
369,
5777,
1027,
285,
342,
247,
1679,
23851,
6452,
50276,
22309,
6507,
1255,
27972,
342,
26647,
323,
3676,
11454,
6928,
436,
310,
2649,
1411,
17857,
77,
2967,
4803,
285,
436,
2278,
310,
417,
281,
5963,
2057,
549,
32693,
2715,
533,
1580,
362,
18,
310,
13644,
2130,
891,
452,
2783,
352,
2190,
253,
4623,
2469,
789,
323,
253,
4390,
9262,
2929,
50275,
16691,
7350,
642,
2173,
1340,
50276,
186,
253,
1416,
9238,
2317,
273,
269,
556,
247,
1199,
625,
7744,
1929,
1416,
2190,
253,
23065,
570,
1546,
2990,
3114,
697,
1925,
253,
9538,
273,
269,
923,
5987,
257,
25842,
2061,
44874,
338,
487,
693,
4349,
47328,
50276,
186,
2593,
2164,
352,
310,
4767,
326,
277,
79,
2224,
403,
5431,
10166,
281,
5058,
3733,
2228,
327,
253,
3733,
873,
347,
247,
34815,
4266,
516,
417,
2119,
364,
319,
1568,
436,
1491,
310,
3551,
432,
277,
79,
2224,
403,
5431,
417,
10166,
281,
5058,
3733,
2228,
347,
326,
2761,
1900,
1543,
275,
689,
31893,
285,
271,
689,
24483,
273,
673,
16777,
1050,
5300,
50276,
186,
4677,
337,
352,
2722,
253,
2957,
1159,
556,
4016,
2228,
387,
2176,
13461,
259,
436,
3133,
281,
17343,
752,
369,
753,
670,
2957,
3470,
275,
253,
12494,
4745,
2720,
326,
253,
2957,
1159,
310,
5058,
390,
2762,
752,
1057,
4016,
2228,
1599,
50276,
186,
891,
1089,
352,
5777,
557,
25203,
3472,
281,
1659,
4275,
14308,
275,
253,
14801,
1271,
281,
5115,
253,
3239,
2701,
18814,
5411,
253,
3239,
2701,
310,
2649,
253,
4096,
273,
253,
3239,
2701,
16764,
30628,
281,
1239,
4791,
7223,
275,
253,
673,
512,
7655,
310,
2649,
1908,
366,
50267,
50237,
270,
835,
253,
4764,
3701,
3711,
310,
2931,
352,
3054,
326,
352,
369,
806,
5611,
275,
821,
282,
3456,
91,
1162,
355,
4765,
253,
3711,
432,
253,
4764,
2317,
534,
3936,
247,
4764,
285,
8115,
352,
281,
247,
1159,
310,
594,
33079,
697,
30455,
281,
9212,
667,
2014,
1436,
347,
697,
24799,
619,
47892,
1930,
3533,
604,
436,
369,
271,
3177,
387,
1881,
26977,
50276,
186,
5884,
253,
2505,
2789,
253,
6923,
432,
298,
12502,
281,
298,
88,
1293,
14851,
326,
340,
310,
4764,
1025,
407,
259,
697,
417,
247,
1943,
2968,
533,
253,
1679,
7615,
9414,
778,
755,
13477,
387,
271,
2393,
3924,
342,
436,
6923,
50276,
186,
2593,
495,
3054,
4931,
253,
22325,
4154,
310,
326,
604,
277,
79,
2224,
10166,
281,
5058,
2228,
403,
1929,
281,
39970,
973,
327,
39709,
941,
28763,
247,
1077,
1943,
604,
275,
3946,
277,
79,
2224,
10166,
281,
5058,
2228,
403,
3264,
281,
417,
39970,
973,
327,
39709,
941,
50276,
186,
2593,
495,
3054,
43261,
472,
1162,
355,
43425,
2692,
326,
2412,
268,
3342,
84,
27972,
3240,
18996,
342,
26647,
2228,
604,
436,
310,
253,
1083,
840,
697,
417,
2590,
281,
479,
604,
253,
1273,
4809,
273,
436,
789,
310,
247,
4460,
7680,
604,
352,
310,
697,
417,
2590,
281,
479,
849,
50275,
186,
2593,
8319,
3054,
326,
253,
277,
9866,
1146,
908,
369,
1119,
281,
320,
10481,
43541,
281,
1957,
2761,
512,
1896,
3470,
253,
774,
86,
2990,
310,
16745,
818,
50276,
1449,
50276,
1449,
50276,
19,
285,
697,
7558,
326,
436,
310,
7032,
273,
9999,
2761,
512,
273,
253,
1896,
5910,
1269,
884,
1706,
3470,
2654,
751,
4737,
323,
1650,
1908,
13989,
495,
432,
253,
973,
1929,
789,
327,
774,
86,
4872,
4811,
5987,
39962,
2061,
5375,
1047,
2640,
1093,
2090,
835,
1014,
253,
954,
11518,
5170,
3033,
323,
253,
1180,
273,
3470,
2475,
494,
407,
247,
774,
86,
2990,
342,
295,
8763,
8512,
310,
374,
79,
50276,
19100,
50276,
80,
31111,
436,
310,
884,
7367,
273,
9777,
2159,
323,
436,
1083,
253,
2406,
3033,
327,
253,
4869,
10012,
577,
310,
1014,
625,
45234,
2531,
285,
1097,
273,
841,
14493,
452,
644,
2011,
281,
320,
689,
32581,
2531,
275,
1996,
2987,
24088,
5987,
39962,
2061,
5375,
1166,
7749,
19,
13391,
3192,
5750,
273,
253,
958,
326,
5058,
19452,
569,
273,
774,
316,
13551,
253,
7877,
273,
253,
2625,
8136,
949,
253,
2990,
3578,
7367,
273,
9777,
310,
1892,
281,
8564,
50276,
186,
2593,
8319,
253,
4477,
1379,
13278,
3302,
2792,
326,
310,
3436,
7367,
273,
9777,
2159,
273,
849,
1142,
3470,
403,
2475,
494,
390,
209,
9754,
4226,
18,
273,
253,
1896,
3470,
651,
306,
644,
19958,
352,
310,
2779,
326,
1014,
1679,
497,
19958,
1580,
841,
2792,
497,
6777,
432,
9093,
247,
11542,
15269,
275,
4764,
2317,
26332,
6486,
3530,
432,
247,
289,
565,
7193,
305,
12064,
3268,
697,
2834,
281,
896,
598,
667,
16774,
1750,
672,
253,
373,
884,
1706,
3470,
50276,
186,
4619,
2593,
8073,
352,
310,
4767,
326,
275,
1340,
281,
6635,
3470,
269,
342,
5058,
2228,
327,
253,
3733,
873,
256,
533,
342,
11117,
26647,
3045,
359,
897,
253,
2983,
1178,
10480,
432,
259,
86,
1162,
355,
4240,
436,
3133,
751,
247,
4619,
19652,
275,
326,
253,
3733,
1232,
310,
417,
256,
35333,
285,
3103,
253,
3268,
273,
3602,
812,
320,
9619,
1027,
685,
326,
273,
970,
256,
35333,
436,
9193,
751,
253,
1941,
760,
8525,
253,
1027,
1750,
326,
323,
436,
2983,
10895,
10480,
841,
3916,
670,
6507,
1255,
4644,
5593,
2186,
50276,
186,
2593,
8676,
3054,
6890,
5556,
14460,
390,
6890,
4373,
22041,
476,
273,
2282,
6990,
253,
26647,
3045,
407,
1355,
8322,
534,
778,
320,
21038,
1774,
275,
8542,
4893,
2032,
285,
3221,
604,
1355,
369,
7932,
407,
1781,
2654,
1333,
2032,
281,
320,
32124,
784,
581,
812,
5206,
4373,
22041,
9559,
2217,
835,
26647,
310,
29607,
4105,
1580,
4373,
22041,
476,
6990,
253,
1268,
273,
26647,
432,
29607,
4105,
281,
1375,
23037,
14387,
323,
690,
3210,
436,
310,
271,
1774,
1127,
4931,
697,
1246,
9366,
275,
253,
4791,
7223,
285,
891,
513,
417,
923,
352,
533,
516,
7591,
281,
1089,
1199,
5301,
323,
436,
387,
512,
50276,
186,
4677,
608,
752,
310,
9765,
13642,
417,
5611,
347,
2080,
347,
891,
476,
1089,
50276,
186,
275,
253,
5955,
352,
3054,
8943,
451,
3720,
323,
436,
9079,
556,
1900,
644,
6804,
352,
3133,
751,
253,
806,
2022,
7680,
273,
436,
789,
11323,
281,
253,
19176,
273,
6804,
1543,
1529,
32491,
1060,
359,
2684,
9470,
16774,
789,
4645,
326,
6507,
1255,
476,
6296,
24888,
342,
26647,
2299,
436,
5921,
310,
417,
1900,
6863,
285,
476,
320,
4354,
7154,
407,
6890,
253,
5556,
6081,
390,
407,
4764,
373,
1179,
272,
752,
891,
1379,
1977,
432,
436,
310,
326,
4536,
697,
9578,
4536,
697,
417,
285,
253,
373,
642,
2590,
3102,
387,
436,
1127,
6523,
347,
436,
310,
253,
2022,
4060,
273,
436,
2929,
352,
36908,
1928,
751,
247,
4460,
906,
387,
512,
352,
3133,
751,
625,
8109,
1941,
326,
359,
1663,
13414,
871,
849,
6507,
1255,
5593,
24888,
342,
26647,
891,
812,
320,
3430,
37699,
651,
320,
9371,
50276,
555,
793,
33513,
50276,
186,
1142,
8442,
403,
1892,
281,
1239,
342,
39064,
452,
721,
31048,
8266,
390,
5010,
3176,
281,
452,
6332,
275,
3510,
33513,
14580,
7063,
689,
3956,
273,
3000,
390,
5046,
13301,
497,
32494,
327,
13818,
6848,
390,
253,
19605,
4583,
253,
373,
1077,
1652,
15274,
2439,
8442,
50276,
186,
4677,
1979,
16149,
9619,
38209,
403,
16907,
12421,
4845,
4536,
4067,
685,
253,
13301,
4536,
417,
8442,
403,
1027,
9552,
1014,
1561,
253,
1072,
854,
2377,
13206,
4677,
50276,
186,
256,
5503,
1375,
273,
253,
1445,
390,
671,
3542,
1375,
23037,
14387,
275,
436,
1072,
2929,
36908,
2430,
271,
31931,
2492,
1580,
697,
908,
2378,
973,
1663,
697,
5611,
7019,
2112,
342,
697,
31931,
2492,
7019,
10159,
417,
3058,
1072,
4566,
323,
2831,
15579,
2636,
5611,
1293,
2455,
970,
352,
969,
305,
12064,
4870,
305,
793,
310,
908,
2709,
2069,
533,
671,
5611,
2709,
2069,
285,
697,
417,
5611,
387,
253,
806,
5107,
2057,
3133,
440,
34092,
50276,
186,
253,
897,
273,
8772,
816,
3630,
352,
562,
3185,
273,
9745,
275,
878,
1417,
490,
25669,
50276,
186,
25241,
3740,
10920,
21094,
2223,
1007,
8244,
387,
253,
7208,
6890,
25241,
10880,
50276,
186,
3036,
4632,
3036,
30762,
4632,
30762,
16186,
21,
4632,
16186,
577,
3036,
84,
4632,
3036,
323,
2709,
749,
42045,
15274,
285,
3839,
8288,
5170,
1083,
3036,
285,
16186,
5257,
281,
1007,
625,
5702,
50276,
186,
2087,
5150,
3510,
33513,
22735,
9894,
1078,
247,
6197,
342,
271,
5150,
7637,
50276,
186,
764,
284,
8470,
816,
2969,
963,
993,
326,
812,
320,
7270,
407,
667,
15368,
2451,
50276,
186,
3036,
374,
4632,
3036,
256,
20,
891,
13414,
755,
253,
23524,
323,
1650,
275,
2593,
8319,
352,
3054,
253,
4677,
310,
275,
253,
30762,
533,
697,
7939,
281,
3963,
3036,
495,
4536,
8442,
342,
256,
3048,
281,
253,
2022,
2505,
4536,
417,
2451,
436,
50276,
186,
253,
1159,
342,
253,
6253,
2720,
5816,
253,
3159,
342,
50276,
1189,
455,
697,
1077,
1048,
627,
403,
247,
1643,
4619,
7350,
670,
253,
3590,
1255,
273,
253,
3916,
627,
403,
690,
3510,
33513,
3374,
253,
8442,
812,
897,
789,
247,
3402,
273,
18660,
273,
1892,
85,
410,
324,
14580,
387,
253,
990,
403,
2649,
9073,
253,
2770,
253,
6452,
323,
479,
310,
3240,
23851,
516,
417,
2119,
752,
281,
1379,
1977,
432,
436,
2929,
275,
2426,
273,
604,
6507,
1255,
2789,
3282,
281,
897,
347,
247,
23403,
273,
26647,
253,
4060,
36908,
7094,
4887,
47515,
1146,
7558,
275,
253,
2929,
1580,
1199,
273,
253,
2929,
16633,
327,
253,
17699,
16561,
2720,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
253,
897,
273,
17699,
16561,
2720,
2220,
31850,
323,
21565,
26647,
3045,
273,
247,
11454,
2990,
285,
45190,
2722,
326,
352,
476,
562,
32231,
6507,
1255,
3169,
5593,
4685,
253,
6944,
4606,
326,
1453,
26647,
3045,
327,
11454,
6928,
310,
273,
1270,
10527,
285,
8542,
6349,
285,
30628,
1089,
6031,
275,
436,
3884,
9865,
2299,
597,
2868,
253,
19529,
275,
1655,
1375,
310,
417,
4704,
323,
9311,
50276,
46458,
1182,
7836,
89,
11532,
253,
9978,
2783,
275,
253,
2929,
1057,
417,
28788,
247,
15958,
4112,
534,
2789,
3916,
670,
253,
17699,
16561,
2720,
1146,
247,
625,
10237,
23403,
685,
6507,
1255,
440,
44167,
4215,
1182,
7836,
89,
622,
250,
31290,
4477,
2380,
285,
8254,
6787,
533,
9010,
253,
7350,
39394,
12506,
649,
11532,
253,
2929,
310,
5075,
275,
247,
2176,
7794,
824,
347,
10941,
2439,
1027,
35615,
1690,
1180,
273,
3602,
285,
10941,
342,
1775,
5556,
6081,
3692,
4736,
310,
281,
1089,
6507,
46836,
285,
556,
2011,
281,
10260,
3157,
253,
26647,
3045,
12506,
649,
14969,
4361,
4477,
2380,
533,
253,
2380,
858,
417,
1361,
342,
6890,
253,
4868,
273,
253,
2929,
391,
18,
45791,
556,
690,
33196,
670,
253,
38135,
273,
253,
789,
285,
253,
3710,
4679,
534,
6376,
39394,
391,
18,
45791,
5936,
326,
253,
4477,
49620,
253,
2929,
281,
22175,
327,
253,
4477,
7680,
275,
1708,
273,
253,
2045,
789,
50276,
3169,
327,
30628,
8680,
891,
1804,
4477,
281,
501,
538,
2225,
846,
3585,
2182,
253,
7482,
281,
2953,
253,
3374,
5439,
1840
] | [
1, 1, 1, … (attention-mask column: an unbroken run of 1s, one per token, collapsed here for readability)
] | [
… (long list of integer token ids — by column order, presumably the label ids for this row — collapsed here for readability)
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies the recovery of mixed multinomial logits using frankwolfe algorithms the authors proved that the proposed algorithm ssrfw converges to the true parameter of each individual mixture model they also provided a sample complexity analysis of the proposed algorithm the paper is generally well written and the results are interesting some clarification on the sample complexity of the subroutine should be added to the analysis the comparison in the experiments could also be improved see my specific questions in the next section for the experiment results i think it would be more reasonable to also compare the proposed method with those in recent related papers such as chierichetti et al 2018 and jagabathula et al 2020 though these baseline algorithms may work in a reduced setting or have different theoretical guarantees their methods should still be applicable to the empirical studies in this paper therefore it would be interesting to see how they compare with one another in the experiments lin 336 reply rely docsepthe authors provide theoretical results on learning mixed multinomial logits the main contribution is a new algorithm that learns both mixture weights and componentspecific logit parameters with provable convergence guarantees for an arbitrary number of mixtures strengths 1 the paper is clearly written and the theoretical results look solid 2 sample complexity is established for the proposed algorithm weaknesses 1 the proposed algorithm performs emlike updates while sample complexity is provided the algorithm could take many iterations to converge and potentially converge to a local optima how does the algorithm address these issues 2 both alpha and beta depend on the parameter epsilon in definition 1 since the magnitude of z could be arbitrary is the condition on beta necessary 3 i think the authors forgot to include the simulation studies as suggested in the abstract na docsepthe paper proposed a new method for learning mixtures of multinomial logits based on historical choice data the proposed method is based on frankwolfe while restricting to a narrowed set of estimated choice probability vectors the paper provided theoretical results on convergence and sample complexity the paper provided numerical experiments to demonstrate the performance of the proposed method strengths the proposed method seems effective for learning multinomial logit mixtures the theoretical results are strong and solid the paper is overall wellwritten weaknesses the paper specifically considers historical choice data which should be reflected in the title it seems a narrow topic of learning multinomial logit mixtures using historical choice data which may reduce the papers significance the preliminaries subsection is a bit brief and it would be more helpful to formally define the historical choice data line 35 typo of frankwolf fw line 330 typo of not only does it resolves na docsepthe authors provide theoretical analysis of a new stochastic subregion frankwolfe algorithm for learning mixed multinomial logits the authors provide provable guarantees showing a polynomial number of samples is sufficient to achieve the performance guarantees simulations are provided in the supplementary material overall the topic and results are interesting the smoothness of reading is affected by some typos that can be improved through further polishing originality the authors provide a novel algorithm that can be considered a variation of the frankewolfe algorithm with a component that takes into account a 
feasible region this concept is quite interesting using this concept the hyperparameter k is not required in the algorithm which makes it appealing compared to algorithms that require knowledge of k quality the paper overall is interesting the algorithms presented make sense see below sections for a few questions see below questions section and several notational confusions see below clarity clarity the clarity in the paper is mostly obstructed by typos and notational issues below are several points during my review clarity of notations and typos for example defining cdot before it appears in theorem 1 in the description of theorem 1 the summation should be over k and not k there are multiple typos throughout the submission but minimally affects the overall read it would be beneficial to fix these typos in further revisions significance the mmnl model frankwolfe algorithm and related topics benefit a range of real world applications and therefore related research including this paper is important limitations are not thoroughly discussed in this submission it would be beneficial to discuss limitations in more detail
### Summary: | this paper proposes frankwolfe method for learning mixture of multinomial logit models the algorithms is novel and the first of its kind to be applied to this problem this is an important addition to the widely studied topic of preference learning and this result extends the capability of what can be efficiently learned under realistic scenario with mixture models mixture models for ranking are notoriously hard problems and a novel approach such as the one introduced in this paper is critical in bridging the gap to making preference learning practical all the reviewers concerns have been addressed in the rebuttal adequately given the good quality of the paper it will be quite beneficial for the readers if the paper spend some more time on explaining how the result compares to the extensive work that has been don on mixture of multinomial logit models such a nicely written related work section will only make the paper more interesting and broaden the impact of the results some important and closely related work that predate chierichetti et al 2018 are missing some more recent related work are missing here is a sampling of such related work that should be compared with learning mixtures of random utility models with features from incomplete preferences zhibing zhao ao liu lirong xia ijcai22 on the identifiability of mixtures of ranking models xiaomin zhang xucheng zhang poling loh yingyu liang httpsarxivorgabs220113132 learning mixtures of plackettluce models from structured partial orders zhibing zhao lirong xia neurips 2019 learning plackettluce mixtures from partial preferences ao liu zhibing zhao chao liao pinyan lu lirong xia aaai19 learning mixtures of plackettluce models zhibing zhao peter piech lirong xia icml16 collaboratively learning preferences from ordinal data sewoong oh kiran k thekumparampil jiaming xu nips 2015 a topic modeling approach to ranking weicong ding prakash ishwar venkatesh saligrama aistats15 learning mixed multinomial logit model from ordinal data s oh d shah nips14 | [
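As an aside to make the Frank-Wolfe idea discussed in this review and its summary concrete, below is a minimal sketch of a plain Frank-Wolfe loop that fits mixture weights over a fixed set of candidate choice-probability vectors by maximum likelihood. It is an illustration only: the candidate matrix `P`, the synthetic `counts`, and the open-loop step size are assumptions made for the example, and this is not the SSRFW algorithm or its subregion restriction from the reviewed paper.

```python
import numpy as np

def frank_wolfe_mixture(P, counts, iters=200):
    """Fit mixture weights w on the simplex over the candidate distributions in
    the columns of P, maximizing the multinomial log-likelihood of `counts`
    with a vanilla Frank-Wolfe loop.  P: (n_items, K), counts: (n_items,)."""
    n_items, K = P.shape
    w = np.full(K, 1.0 / K)                      # start at the barycentre of the simplex
    for t in range(iters):
        mix = P @ w                              # current mixture probability of each item
        grad = P.T @ (counts / np.maximum(mix, 1e-12))   # gradient of the log-likelihood
        k = int(np.argmax(grad))                 # linear maximization over the simplex = best vertex
        gamma = 2.0 / (t + 2.0)                  # standard open-loop step size
        w = (1.0 - gamma) * w
        w[k] += gamma                            # convex step toward that vertex
    return w

# toy usage with made-up candidate distributions and synthetic counts
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(5), size=3).T          # 5 items, 3 candidate components
counts = rng.multinomial(1000, P @ np.array([0.5, 0.3, 0.2]))
print(frank_wolfe_mixture(P, counts))            # estimated weights; with enough data these approach [0.5, 0.3, 0.2]
```

Because the feasible set is the probability simplex, the linear-maximization step of Frank-Wolfe reduces to picking a single best vertex, which is part of what makes this family of methods attractive for fitting mixture weights.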
… (long list of integer token ids — presumably the input ids encoding the example above — collapsed here for readability)
] | [
1, 1, 1, … (attention-mask column: an unbroken run of 1s, one per token, collapsed here for readability)
] | [
… (long list of integer token ids — by column order, presumably the label ids for this row — collapsed here for readability)
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors propose a distributionally robust optimization dro framework for structured prediction over trees where the ambiguity set is given by a maximum feature moment difference to the empirical distribution they derive an unconstrained dual problem and establish a worstcase excess risk bound for the optimal parameter they discuss strategies for efficient approximate computation of this parameter with particular emphasis on projection onto the arborescence polytope the method is illustrated on three dependency parsing tasks where it performs competitively with a neural dependency parser with consistent outperformance in lowdata settings strengths this is a very wellwritten paper the writing is direct precise and informationdense the exposition makes extensive reference to relevant work providing rich context and highlighting the gaps that the authors aim to fill the contribution appears novel and and is thoroughly developed in its theoretical computational and empirical aspects there are significant technical contributions in terms of a novel dro problem formulation and its analysis in terms of the statistical properties of the associated estimator and in terms of considerations for its efficient computation the experiments compare the method to a strong baseline quantify uncertainty in the result and consider both accuracy metrics and computational costs weaknesses except on the ud dataset the empirical results are somewhat modest outside of two verylowdata settings m10 and m50 some tradeoffs and assumptions are discussed at the end of section 7 the authors acknowledge that computation time may be a bottleneck and address tradeoffs in computation between the marginal and stochastic game implementations they point out fairly that for the increase in computational cost versus the biaf baseline they are able to offer a guarantee of distributional robustness docsepthis paper attempts to develop a better learning algorithm for structured prediction tasks in particular trees the most direct application being supervised dependency parsing dependency parsing has a long line of work most of which are trained with maximum likelihood marginbased or ermbased objectives the authors point out that such training objectives are not aligned with the evaluation objective further such methods have nonconvex objectives due to which they lose convergence and generalization guarantees to mitigate that the paper proposes a training approach based on the distributionally robust optimization framework dro in very simple terms in dro the learning algorithm is unaware of the underlying distribution apart from some knowledge over the support and the goal is to minimize the worstcase risk over some uncertainty set they show the equivalence between the drobased objective to a convex surrogate loss function and derive uniform convergence based generalization bounds they develop efficient training algorithms and show convergence guarantees they evaluate their method on three dependency parsing datasets ptb ud and ctb with biaf as the baseline in particular they focus on low training data settings 10 1000 training examples they use their method to predict the unlabelled dependency tree and use a different classifier to predict arc labels the model uses representation learned by roberta and the final linear predictor is learned by their approach or biaf as baseline their approach achieves consistent gains over biaf on the three datasets strengths s1 interesting method and theoretical results the adaption of the dro 
framework for treestructured prediction is quite interesting the issues with prior approaches are well described and show a natural fit for the proposed approach provable methods for structured prediction are relatively scarce due to various reasons and this paper seems to take a step in that direction s2 efficiency and convergence the designed algorithms seem to be reasonably efficient and fast convergence makes it appealing from a practical standpoint s3 writing the paper is well written the motivations are clearly outlined and the arguments are wellpresented weaknesses w1 incremental improvement compared to the baseline biaf the proposed method achieves very small improvements across all tasks even in the low training data setting w2 reliance on prior representations although representation learning is not the main goal of the paper the disconnect with the representation learning model puts it in a position of disadvantage in experiments they first train the baseline model robertabiaf on the training data and then use its representation assuming prefinal layer as features for the proposed model compared to endtoend models the proposed method could perform worse due to the lack of proper features general comment overall i think it is an interesting paper despite weak practical gains the work could be helpful to other researchers in the community working on such structured prediction problems limitations are discussed in the final section docsepthis work proposes a momentbased distributional robust optimization approach for tree structured prediction which is further applied to dependency parsing the authors propose the instantiation of the distributional robustness objective to the tree structured prediction setting in the parameterization of marginal probabilities the authors also discuss the risk bound for the proposed methods and methods for projection to the arborescence polytopes experiments on three datasets demonstrate the effectiveness of the proposed method in the fewshot setting strengths i like the rigorous mathematical discussions on this work the authors give thorough discussions about the background and clear notes on the related work of multiple threads and how this work is built on top on the existing methods the idea of using distributionally robust optimization for tree structured prediction is indeed significant and would add valid contribution of the field experiments show its effectiveness especially on lowresource settings weakness ideally distributionally robust optimization should be validated on datasets with distributional shifts eg from news domain to dialog domain but the evaluation is more on a indistribution fewshot setting rather than a distributional shift setting i would expect the authors to elaborate this a little bit in the discussion period the implementation is on cpus with c and python it remains unclear how this method can be implemented on gpuaccelerated automatic differentiation libraries like pytorch such limitation could severely restrict the applicability of the proposed method the gradients wrt the projection step to the polytope section 4 and figure 1 seems to be not welldefined and have previously been under discussions1 2 it is not very clear in this papers setting how such problem may or may not become an issue in terms of optimization 1 peng et al acl 2018 backpropagating through structured argmax using a spigot 2 mihaylova et al 2020 understanding the mechanics of spigot surrogate gradients for latent structure learning see above 
discussions
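One of the weaknesses raised above asks whether gradients with respect to a hard projection or argmax step are well defined. A common workaround in latent-structure models is to pair a hard forward pass with a soft backward pass; the sketch below shows the simplest such surrogate, a straight-through argmax in PyTorch. It only illustrates that general trick — it is not the SPIGOT estimator cited in the review, nor the arborescence-polytope projection used in the paper, and the toy `weights` vector is invented for the example.

```python
import torch

def straight_through_argmax(scores: torch.Tensor) -> torch.Tensor:
    """Hard one-hot argmax in the forward pass, softmax gradients in the
    backward pass (a straight-through surrogate for the non-differentiable
    argmax).  scores: (..., num_classes)."""
    soft = torch.softmax(scores, dim=-1)
    index = soft.argmax(dim=-1, keepdim=True)
    hard = torch.zeros_like(soft).scatter_(-1, index, 1.0)
    # value equals `hard`, but gradients flow as if the output were `soft`
    return hard + soft - soft.detach()

# toy check: the output is one-hot, yet gradients still reach the scores
scores = torch.randn(2, 4, requires_grad=True)
weights = torch.tensor([1.0, 2.0, 3.0, 4.0])     # made-up downstream weights
loss = (straight_through_argmax(scores) * weights).sum()
loss.backward()
print(scores.grad)
```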
### Summary: | this paper presents theoretical results for structured prediction over trees and empirical results on three syntactic dependency parsing datasets all reviewers agree that the paper is well written and novel the empirical gains are not huge but the comparisons are done in a thorough and fair way all reviewers suggest acceptance even though some reviewers have low confidence and the meta reviewer agrees as well | [
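For readers unfamiliar with the marginal parameterization over trees mentioned in the review above, the following sketch computes arc marginals of an edge-factored distribution over (multi-root) dependency trees using the directed Matrix-Tree theorem, a standard construction in this literature. The weights `psi_root` and `psi` are random placeholders, and this is meant as an illustration of the classical identity rather than the implementation used in the reviewed paper.

```python
import numpy as np

def arc_marginals(psi_root, psi):
    """Arc marginals of an edge-factored distribution over (multi-root)
    dependency trees via the directed Matrix-Tree theorem.
    psi_root[m] : weight of the arc root -> word m            (shape (n,))
    psi[h, m]   : weight of the arc word h -> word m, h != m  (shape (n, n))
    Returns (mu_root, mu) with the same shapes."""
    n = len(psi_root)
    psi = psi * (1.0 - np.eye(n))                     # forbid self-loops
    L = -psi.copy()                                   # off-diagonal: -psi[h, m]
    np.fill_diagonal(L, psi_root + psi.sum(axis=0))   # diagonal: total incoming weight of m
    Linv = np.linalg.inv(L)
    mu_root = psi_root * np.diag(Linv)                # P(root is the head of m)
    mu = psi * (np.diag(Linv)[None, :] - Linv.T)      # P(h is the head of m)
    np.fill_diagonal(mu, 0.0)
    return mu_root, mu

# quick sanity check with random positive weights: every word gets exactly one head,
# so the head probabilities for each word should sum to one
rng = np.random.default_rng(0)
mu_root, mu = arc_marginals(rng.random(4) + 0.1, rng.random((4, 4)) + 0.1)
print(mu_root + mu.sum(axis=0))
```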
… (long list of integer token ids — presumably the input ids encoding the example above — collapsed here for readability)
] | [
] | [
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper introduces an approach for masked language modeling where they mask wordpieces together which have high pmi the idea is relatively simple has potential for high impact through broad adoption and the paper is clearly written with extensive experiments the experiments on glue squad and race are very well set up for example evaluating multiple learning rates for each downstream task is expensive but really adds to confidence i have in the results reporting the best median dev score over five random initializations per hyperparameter then evaluating that same model on the test set definitely improves the reproducibility of the results in addition showing how performance is affected by the amount of pretraining data is very useful and the experiments range from small scale to large scale the ablations adjusting the vocabulary size which in turn changes the size of wordpiece tokens is a valuable contribution and i would have asked to see something like this if it wasnt included table 4 is a nice addition its interesting that the mlm loss is not predictive of downstream performance i suspect this approach will become widely adopted or built upon in future work pretraining language models i give this paper an 8 only taking off points because the idea is relatively intuitive and doesnt really open a broad new area for future work i dont see any obvious methodological flaws which frankly we can find in most papers i would be interested in seeing if this reduced the variance of the finetuning process that might be something the authors could include for the camera ready maybe in the appendix edit after reviewing the author response i will keep my score as it is i believe the paper should be accepted docsepsummary the paper proposes a variant on the mlm training objective which uses pmi in order to determine which spans to mask the idea is related to recentlyproposed whole word masking and entitybased masking but the authors argue the pmibased approach is more principled the method is straightforwardit involves computing pmis for ngrams in this case up to length 5 over the training corpus and then preferring to mask entire collocational phrases rather than single words during training the intuition is that masking single words allows models to exploit simple collocations thus optimizing their training objective without learning longerrange dependencies or higher level semantic features of the sentences and this makes training less efficient than it could be one contribution of the paper is a variant on the pmi metric that performs better for longer phrases by reducing the scores of phrases that happen to contain highpmi subphrases eg george washington is should not have a high score despite the fact that george washington does have a high score the authors compare their method against vanilla bert with random masking as well as against recently proposed variants such as spanbert and ambert and show consistent improvements in terms of final performance as well as better efficiency during training by way of analysis the authors also make an argument that tokenlevel perplexity is not correlated with downstream performance this is an interesting point to make though they do not expound upon it in this paper strengths the proposed method is simple and principled the empirical results show consistent improvement on standard benchmark tasks the proposed variation the pmi metric is a nice subcontribution weaknesses a somewhat marginal contribution its not significantly different from the variants 
proposed previously eg spanbert entity masking the evaluation focuses purely on benchmark tasks which are known to have flaws eg the current superhuman performance on these tasks already makes gains on them suspect id have liked to some more analysisdiscussion of the linguistic consequences of this new objective see more specific comments below additional commentsquestions i am curious about the more general effect of this training objective on the models linguistic and particularly syntactic knowledge eg can you say more about how often the model sees unigrams being masked and how the distribution of these unigrams differs from what would be seen if we did random masking i ask because i could imagine that this objective has a noticeable effect on the masking of function words eg preposition occurring more often in collocations pronouns and determines maybe less often and thus the model might get differing access to these words in isolation since function words carry a lot more signal about syntactic structure than do content words and phrases of the type you are capturing in your pmi metric im very curious if there are some tradeoffs or possibly additional advantages that comes with your method that are not reflected by the benchmark performance metrics squad and glue are going to favor knowledge of things like entities and events and capture very little about more nuanced linguistic reasoning so reporting performance on some more recently released challenge sets or using some probing studies or at least just giving some analysis of winloss patterns would be very informative for assessing the contribution of this paper to nlp more generally docsepsummary this paper presents a masking strategy for training masked language models mlm the proposed strategy builds on previous approaches that mask semantically coherent spans of tokens such as entire words named entities or spans rather than randomly masking individual tokens specifically the proposed method computes the pmi of spans and the generalization for spans of size 2 over the pretraining corpus and randomly masks from among the 800k spans lengths 25 with the highest pmi masking based on pmi removes the ability for the model to rely on highly local signals to fill in the mask and instead focus on learning higher level semantics they motivate this hypothesis with an experiment demonstrating that as the size of the wordpiece vocabulary decreases and words are more frequently split into multiple tokens rather than being their own token the transfer performance of the resulting mlm decreases however using wholeword masking with this same vocabulary size recovers much of the original performance indicating that allowing the model to rely on these strong local signals harms the transfer quality of the resulting model experiments the paper evaluates on three standard nlu benchmarks squad2 glue and race they compare their pmibased algorithm against random token masking random span masking and a naive version of their pmi masking strategy all baselines are implemented within a single codebase their strategy outperforms randomspan masking on squad throughout pretraining though the latter does close the gap on a smaller pretraining corpus 16gb they show this gap remains if they use a larger pretraining corpus 54gb as would likely be the case with a largescale pretraining experiment the resulting models also consistently outperform all other baselines on all tasks they do additionally compare to outside models ambert spanbert roberta and show their 
pmibased models outperform or perform similarly to these models overall this paper introduces a solid theoretical basis for existing and new methods for training masked language models they present robust experiments demonstrating the efficacy of their method under various settings 1 missing citation for race and original squad dataset 2 i assume this doesnt happen often but is it ever the case that a training example does not have any maskable spans docsep summary this paper proposes an improvement to how tokens are selected for masking in pretraining large masked language models bert and family specifically it stipulates that purely random choice of words or word pieces makes the mlm task insufficiently hard it then goes on to propose a datadriven approach for selecting ngrams to mask together the approach based on an extension of pointwise mutual information for ngrams is shown to outperform random token and random spans masking strategies on performance of downstream tasks strong and weak points strong points the paper is very well written throughout and easy to follow the problem is well motivated with empirical evidence i think section 2 demonstrates well the case for random masking being too easy the multivariate version of pmi proposed is simple and well motivated the evaluation experiments are convincing the results are robust across the tasks shown weak points the one main drawback in this study is the lack of comparison with entitybased techniques for masking in particular 1 has recently defined salient span masking based on named entity recognition and dates salient span masking has been adopted in 2 where it is shown to boost performance of opendomain question answering by 9 points table1 of 2 models tagged with ssm i think it would be extremely interesting to compare these techniques ssm specifically but entitybased techniques in general with pmimasking specifically it is currently unclear whether the pmi based ngram masking vocabulary simply ends up rediscovering popular named entity mentions or whether there are more interesting subphrases eg idiomatic subphrases that a ner system would not select finally it would be interesting to empirically test whether these extra nonentity ngrams provide further performance boost over the entitybased salient span masking 1 realm retrievalaugmented language model pretraining httpsarxivorgabs200208909 2 how much knowledge can you pack into the parameters of a language model httpsarxivorgabs200208910 recommendation i recommend this paper for acceptance the analysis and ideas throughout the paper are well executed i also think the topic should be of high interest to the iclr and nlp communities given the importance of mlm pretraining on most stateoftheart models at the moment despite the lack of comparison with entitybased techniques having a statistically principled alternative solely based on cooccurrence without linguistic grounding seems interesting questions for authors 1 the main question here related to the entitybased approaches discussed above i think it would be interesting to address this issue given how closely related it is to this work and the good performance the cited papers demonstrate using it i can think of a couple of ways to address this comparison 1 qualitative analysis of the masking vocabulary to better understand the differences between pmi ngrams and entity mentions 2 experimental analysis incorporating some entitybased masking into the experiments in the paper 2 irrespectively of how you choose to address the 
entitybased comparison i was interested in some analysis or sampling of the pmimasking vocabulary to understand what type of ngrams are being selected entity mentions idiomatic phrases nounphrases etc would be interesting if you could make this vocab available or add a small sample in the appendix 3 throughout the paper there is an assumption that contiguous words are considered for masking this was not immediately clear in the beginning of the paper i realized it only in section 32 with what about contiguous spans of more than two tokens but one question came to mind what about correlated noncontiguous spans for example eigenvalue and eigenvector are unlikely to be present in the same ngram but have reasonably high chances of showing up together in the same passage have you considered extending this work to noncontiguous spans is there any expectation that this would help learning or is it just a bad idea 4 was there an attempt to mix masking strategies during pretraining although table 6 is convincing in demonstrating that singletoken perplexity is not correlated with performance of downstream tasks the differences seem curious one is left wondering if there is any benefit in adding a small number of easy masking cases ie randomtokens or randomspans
### Summary: | this paper describes a new and experimentally useful way to propose masked spans for mlm pretraining by masking spans of text that cooccur more often than would be expected given their components ie that are statistically likely to be noncompositional phrases the authors should make some attempt to connect their pmi heuristic with prior methods for statistical phrasefinding and term recognition eg httpswwwaaaiorgpapersijcai2007ijcai07439pdf or httpslinkspringercomchapter101007978354085287224 in the final paper | [
] | [
] | [
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a hierarchical multiagent reinforcement learning method for the restricted communication setting and verifies the algorithm performance in a number of useful applications the hierarchical approach to the networked marl problem proves novel effective and interesting strengths the work targets an arguably less explored area by focusing on the restrictions on interagent communication that may be present in realistic scenarios evaluation setup is varied explained in detail and visualized in an intuitive manner niche is wellidentified and the contribution is clear major concerns the reviewer had issues positioning the paper among the different lines of research although the research gap itself is clear scalable marl methods in a restricted communication setting it isnt obvious why and how relevant the cited works are for example mentioning vdn qmix and qtran which together are some of the latest works in the factorization methods does not seem to serve any further purpose as they are no longer compared quantitatively or qualitatively to ltos the authors claim that they are not scalable leads the reviewer to anticipate that ltos naturally is scalable but there appears to be no evidence whatsoever presented in the latter sections of the paper to show let alone prove the superior scalability of ltos with for example growing numbers of agents and training times furthermore some of the cited works have been left out at the evaluation stage which leaves the reviewer puzzled as to which baselines ltos really hopes to outshine the work needs some justification over why the following studies have not been compared to in the evaluation if the main strength of ltos lies in its capability to function effectively and efficiently in restricted communications setting comparison to one or more of the following works should be of great advantage in illustrating that edge dialrial by foerster 2016 learning to communicate with deep multiagent reinforcement learning bicnet by peng 2017 arxiv multiagent bidirectionallycoordinated nets commnet by sukhbaatar 2016 neurips learning multiagent communication with backpropagation ic3net by singh 2019 iclr learning when to communicate at scale in multiagent cooperative and competitive tasks schednet by kim 2019 iclr learning to schedule communication in multiagent reinforcement learning if the main strength of ltos lies in its capability to resolve selfishness and assign credits appropriately to bring about a harmonious cooperation in social dilemmas analysis with respect to the this work should be helpful eccles 2019 corr learning reciprocity in complex sequential social dilemmas it would be interesting to draw some parallels between ltos and bad as both draw inspiration from a hierarchical decomposition bad by foerster 2019 icml bayesian action decoder for deep multiagent reinforcement learning this recent aamas paper is based on peer evaluation and exchanging evaluation messages computed from recently obtained rewards peddqn by hostallero 2020 aamas inducing cooperation through reward reshaping based on peer evaluations in deep multiagent reinforcement learning some of the potential issues to discuss are bandwidth usage of message exchange message overhead in sharing the neighbors rewards using neighbors information to achieve scalability in marl most likely requires discussion of meanfield methods such as yang 2018 icml mean field multiagent reinforcement learning going through the appendices spurred a great deal of curiosity as the authors mention 
that all agents share the same synchronized random number generator with the same seed across all the agents this leads me to believe that the philosophy of decentralized learning is lost in ltos synchronization is definitely not costfree all the more so if the synchronized rng is used to sample an experience from the agents replay buffers how do the agents synchronize their rng in a decentralized manner in the routing evaluation has overhead been taken into account how does ltos fare with respect to varied communications channel what if the network were sparser do you observe any trends as you vary the extent of network connectivitydocsepthe paper present a new method called ltos which enables agents to share rewards in marl two levels of policies highlevel and lowlevel determines rewards and optimize global objectives three diverse scenarios were used to test the performance of ltos compared to other baseline methods ltos consistently outperforms other methods in the second scenario authors also show the need for highlevel policy by introduction fixed ltos at the end of introduction the sentence ltos is easy to implement and currently realized by ddpg can be misleading because of the word realized and the fact that authors argue that ltos is a newly proposed method does this mean ltos simply combines ddpg and dgn do figure 5 and 6 represent selfishness of agents when ltos is used minor editorial errors in appendixdocsepsummary the paper considers the cooperative marl setting where agents get local rewards and they are interconnected as a graph where neighbors can communicate the paper specifically considers the communication of reward sharing that is an agent shares part of its reward to its neighbors such that each agent optimizes its local reward plus rewards from its neighbors this motivates a bilevel optimization framework where the highlevel policy decides how the rewards are shared and the lowlevel policy locally optimizes the shared rewards given the highlevels decision the papers flow motivates such a framework well the experimental results demonstrate the methods effectiveness i think it is a strong paper accept but my confidence is low due to the following confusions i have commentsquestions 1 i have a highlevel comment on the reward sharing mechanism it seems that the proposed method does not support multihop sharing because rewards can only be shared to neighbors why is this singlehop sharing effective in the experiments is it because of domainspecific reasons or its because that singlehop sharing is in principle equally effective why 2 the derivation of 18 using taylor expansion is unclear to me could the authors explain it with more details 3 i dont fully understand the proof of proposition 42 specifically does phi can be learned in a decentralized manner mean that the optimal phi can be based on only the local observation for each agent instead of based on global state could the authors comment on the approximation error induced by the meanfield approximation why the proof begins with phii based on oi and ends with phii based on global state s 4 in equation 17 and 20 should phi be just phi ie no here 5 the lowlevel policy is to optimize the shared rewards my understanding is that any singleagent rl algorithm can be used for optimizing the shared rewards eg dqn ddpg a2c etc why would the authors choose dgn a rather less popular rl algorithm have the authors tried more popular algorithms as the lowlevel policy 6 for fixed ltos how do we determine the fixed sharing weights 
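To make the reward-sharing mechanism discussed in these reviews concrete, below is a minimal sketch of neighbor-limited reward redistribution as the reviews describe it; it is not the authors' implementation, and all names (redistribute, sharing_weights, the toy line graph and its numbers) are hypothetical. The only property it demonstrates is that row-normalized sharing weights redistribute local rewards between an agent and its neighbors without changing the total.

```python
# Minimal sketch of neighbor-limited reward sharing (hypothetical names; not the
# authors' code).  sharing_weights[i][j] is the fraction of agent i's local reward
# given to agent j, where j is i itself or one of i's neighbors; each row sums to 1.
from typing import Dict, List


def redistribute(rewards: Dict[int, float],
                 neighbors: Dict[int, List[int]],
                 sharing_weights: Dict[int, Dict[int, float]]) -> Dict[int, float]:
    """Return the reward each agent's low-level policy would optimize."""
    shared = {i: 0.0 for i in rewards}
    for i, r_i in rewards.items():
        for j in [i] + neighbors[i]:           # rewards can only flow to neighbors
            shared[j] += sharing_weights[i][j] * r_i
    return shared


# Toy line graph 0 - 1 - 2: agent 2 gives half of its reward to agent 1.
neighbors = {0: [1], 1: [0, 2], 2: [1]}
rewards = {0: 1.0, 1: 0.0, 2: 2.0}
weights = {0: {0: 1.0, 1: 0.0},
           1: {1: 1.0, 0: 0.0, 2: 0.0},
           2: {2: 0.5, 1: 0.5}}

shared = redistribute(rewards, neighbors, weights)
print(shared)                                   # {0: 1.0, 1: 1.0, 2: 1.0}
assert abs(sum(shared.values()) - sum(rewards.values())) < 1e-9   # total unchanged
```

In the method as the reviews describe it, the weight vectors would be produced by a learned high-level policy rather than fixed as in this toy example, and the redistributed rewards would then be optimized by each agent's low-level learner (e.g. DDPG/DGN).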
thanks for the response ive increased my confidence docsepthe paper addresses multiagent rl problems by presenting a decentralized approach where the agents learn to share their reward with their neighbors in this method a highlevel policy determines a weight vector for weighting the reward of neighboring agents and then each agent learns their own independent policy the learning is thus conducted locally in a partially connected network toward a common goal and without the knowledge of global state and actions overall the approach is intuitive and interesting for decentralized learning in marl tasks however i have some commentsquestions for improving the paper that are summarized below hence i vote to reject at this stage pros intuitive design of communication among agents in decentralized setting clever adaption of algorithms well written paper and properly organized comments the contribution of the paper is mainly in formulating the problem in the actorcritic setup of ddpg method which leads to a limited novelty a key concern about the paper is how to decompose the reward in the first place the paper aims at optimizing a global objective and assumes also in the propositions that this objective has additive connection with the decentralized rewards nevertheless this is a strong assumption particularly in realworld applications a global reward can be decomposed into summation of smaller rewards but not necessarily the other way around as long as there is a global objective we need a way to distribute the reward among the agents via learning or reward reshaping or even manually how can we properly define the reward of each agent in such scenarios it is also unclear what is the benefit of sharing only with the neighbors the method learns a weight vector of size ni for every agent does it make a difference in the architecturealgorithm if we learn the weights of all the other agents size n instead formulating the weights as finite discrete values looks unnatural if the method is designed for continuous action space it is expected to have the notations to be continuous as well can we just simply convert the summations into integration in the propositions the authors claim that the problem with the related work is that they can not scale up with the number of agents however there is no empirical support that how the proposed approach deals with largescale problems in general the experiments are small and based on simulation and simulated scenarios are not considered realworld which is claimed otherwise in the paper i would recommend to incorporate more supportive empirical evaluation minor what is phii in eq 17docsepupdate i appreciate the detailed replies to my questions indeed some of the points i raised were addressed well and the paper updated accordingly however some new concerns were also raised by the replies using 3 seeds for the experimental evaluation is an extremely questionable evaluation protocol there is no way to know if any of the results are going to hold up its also clear now that none of the experiments are comparing to benchmark numbers from other publications it would have been more confidence inspiring if the method was tested on a set of tasks where external benchmarks have already been established this is particularly true for the new results that were added to the paper eg the qmix results its difficult to make sense of them and the instability points towards a potential hyperparameter issue all baselines for the prisoners case should at least compare to the fully 
cooperative case of adding up the rewards comparing to a dqn baseline that maximizes individual rewards is a red herring its odd that all experiments require less than 1000 episodes to train this is very unusual for challenging multiagent rl problems it would be great to understand if the main selling point of ltos is samplecomplexitylearning speed or if there is something else going on i also agree with the concern raised by other reviewers that the paper is currently not positioned clearly all things considered i believe my score is still appropriate for the paper however i also believe that a future version of the paper with clarified positioning and more thorough experimental evaluation could make for a compelling contribution original review obviously ctde cannot address such problems due to the curse of dimensionality ctde means that there is the option to use centralized information at training time clearly some ways of using centralized information will scale better than others and claiming that none of them scale is simply unfounded one is that the reward function tragedy of the commons i am struggling to make sense of this paragraph please work on the clarity of the writing however they are learned in a centralized way and hence not scalable these methods have been scaled to large numbers of agents in complex environments please provide citations when making a claim that something doesnt scale for example the the starcraft multiagent challenge samvelyan et al 2020 includes results for numbers of agents comparable to the largest experiments in this paper moreover the observation of each agent oi oi can be enhanced to the cartesian product of agent i and its neighbors jiang et al 2020 or the observation history chu et al 2020 i dont follow this if the observation of each agent includes the observation of all neighbors which includes the observation of their neighbors then shouldnt everyone observe everything equation 1 is wrong the lefthand side conditions on oi but the righthand side conditions on s this also affects all following equations the simple way to optimize the global objective is that each agent maximizes its own expected return which is known as markov game this is wrong when each agent optimizes their own expected return this is typically not a means of optimizing the global objective in networked marl as the reward of an agent is assumed to depend on the actions of neighbors we allow reward sharing between neighboring agents the reward function also depends on the global state s which is a function of the joint action of all of the agents so this local reward sharing seems clearly insufficient in general eqn 6 to 15 this proof seems unnecessarily cumbersome w only redistributes the rewards so the sum of total rewards is unchanged qed unlike existing hierarchical rl methods we can directly construct the value function and action value function of based on the value function of at each agent constructing the value function isnt really the problem but approximating and learning it is challenging theory 43 each vertex has its own local policy ij wij oi and we can verify their independence by means of markov random field this is not clear to me furthermore given that the transition function conditions on the joint action and that the reward function depends on the central state this seems wrong unless i am mistaken the dependency on the central state should break any locality assumptions experiments the results on the prisoners dilemma are misleading clearly if there is 
an ability to change the reward functions of individual agents which is assumed by ltos there is no more social dilemma as such only baselines that maximize the total reward are credible comparisons and seem to be missing completely the traffic and routing experiments seem more interesting a few caveats none of the results include uncertainty estimates it is furthermore unclear how many seeds were used furthermore the fixed ltos baseline for ablation we keep the sharing weights fixed for each agent named fixed ltos seems odds did you try a baseline where all agents simply share their reward equally with their neighbors also centralized baselines are missing eg httpsarxivorgpdf191000091pdf in routing fixed ltos ie not learning to share and ltos seem indistinguishable
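To illustrate this reviewer's point that the ability to redistribute individual rewards removes the social dilemma, here is a small self-contained example with hypothetical prisoner's-dilemma payoffs (the numbers are illustrative and not taken from the paper): defection is the dominant action under individual rewards, but once each agent optimizes an equally shared sum of rewards, cooperating becomes the best response regardless of the other player's action.

```python
# Illustrative prisoner's dilemma payoffs (hypothetical numbers, not from the paper).
# payoff[(a1, a2)] = (reward of player 1, reward of player 2); "C" cooperate, "D" defect.
payoff = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}


def best_response(reward_fn):
    """Player 1's best action against each action of player 2 under reward_fn(a1, a2)."""
    return {a2: max(("C", "D"), key=lambda a1: reward_fn(a1, a2)) for a2 in ("C", "D")}


def individual(a1, a2):
    return payoff[(a1, a2)][0]          # player 1 keeps only its own reward


def shared(a1, a2):
    return sum(payoff[(a1, a2)]) / 2    # rewards are pooled and split equally


print(best_response(individual))  # {'C': 'D', 'D': 'D'} -> defection dominates
print(best_response(shared))      # {'C': 'C', 'D': 'C'} -> cooperation dominates
```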
### Summary: | although there was some initial disagreement on this paper the majority of reviewers agree that this work is not ready for publication and can be improved in various manners after the discussion phase there is also serious concern that the experiments need more work statistically to verify if they hold up more comparisons with baselines are required as well the paper could also be better put in context with the sota and related work the paper does contain interesting ideas and the authors are encouraged to deepen the work and resubmit to another major ml venue | [
8245, 3082, 46007, 375, 12724, … (input_ids: token-ID array, truncated)
] | [
1, 1, 1, … (attention_mask: all ones, truncated)
] | [
8245, 3082, 46007, 375, 12724, … (labels: token-ID array, truncated)
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the paper proposes an agent that combines causal discovery and hierarchical reinforcement learning hrl combined into causalitydriven hrl or cdhrl the environment variables which form the basis for the nodes of the structural causal model that the agent uses are assumed to be available through an oracle training happens in two stages first the causal structure is discovered by training the agent to reach an expanding set of subgoals after which the agent is trained on the actual task on the two environments that are evaluated the cdhrl method discovers reasonable causal graphs and outperforms the two baselines several ablations show that those results are robust to noise in the provided environment variables and limited iteration depth strength combining causality with hrl makes lots of sense often hrl approaches dont explicitly address the causality but assume it implicitly assuming the environment variables are available providing the assumption of a causal structure as an inductive bias is as natural as transformers are for language or convnets for images strength the interaction between the subgoal construction and the discovery process effectively induces an exploration reward so the proposed agent not only learns about causal structure in the environment that it sees but also ensures it sees many interesting aspects of the environment strength the paper is wellstructured and easy to read weakness the availability of the environment variables is assumed the authors mention a few methods aimed at discovering those but no experiments are included where discovered environment variables are being used weakness the results are a bit limited two tasks two baselines they make the point that in the right circumstances this method is effective but they dont really address the question which circumstances are the right ones weakness the paper is in need of a round of language editing the assumption that the evs are available through an oracle is limiting the results presented are for cases where they are available discovering them in an unsupervised way can be problematic the signal for the usefulness of candidate evs will have to come from a combination of task reward and causality discovery success hence discovering evs is likely to require interaction with the causality discovery process which would create a coldstart problem the question of ev discovery is not addressed in any of the experiments that isnt a strict necessity for this paper but it does limit the applicability of the method and addressing it would make the paper stronger the environment variables are assumed to be discrete this assumption makes the exploration easier by limiting the effective number of states that need to be considered in the causal graph again addressing this aspect ideally with experiments which would presumably take place on other environments than the ones used would clarify the applicability domain of the proposed method the causal structure of the environment is assumed to be small enough in number to constitute a nicely explorable graph the pretraining phase runs until no new causality is discovered this assumption wont generally hold and addressing the question of how much to invest in causality discovery would be valuable for understanding the generalizability of the results docsepthe paper proposes a causality driven hierarchical reinforcement learning framework to build hierarchical structures in an iterative boosting way the paper shows experimental results in 2dminecraft and eden environments with 
sparse rewards strengths the paper is well written and clear the experiments are well done weakness the weakest part of the paper is that it is not clear where the novelty of the paper lies it would be greatly appreciated if the authors differentiate their work against prior work multiple previous work tries to find graph like structures in environments for finding skillsoptions etc for example robot learning from demonstration by constructing skill trees important prior work on hrl and minecraft 1 forgetful experience replay in hierarchical reinforcement learning from demonstrations 2 alignrudder learning from few demonstrations by reward redistribution the assumptions that environment variables are given to us is the main limitation of the paper finding these environment variables along with the structure is difficult docsepthis paper proposes a new hierarchical reinforcement learning method that could automatically discover hierarchical structure with a learned causality graph to build the causality graph the proposed method introduces the environment variables as nodes in that graph which requires some taskspecific knowledge the causality graph and subgoalbased hierarchical policy are simultaneously learned and facilitate each other significance automatic hierarchical structure discovery is a fundamental research problem in deep reinforcement learning this paper provides a new way to solve this problem which owes remarkable significance originality the idea of using a causality graph to discover hierarchical structure is novel however the proposed method relies on the environment variables the way of obtaining the environment variables is nontrivial so i doubt whether this method could be applied to more general settings where the oracle environment variables are hardly accessible clarity the writing of this paper is mostly clear but seems in a rush for example there is a latex compile error in line 485 the authors are encouraged to polish up the writing in the future quality the idea of using a causality graph to facilitate the discovery of hierarchical structure makes sense since there are causality relationships among subtasks in most hierarchical tasks the proposed method is compared against previous goalconditioned hrl works however claiming hac as a stateoftheart method is not precise a lot of more advanced goalconditioned hrl works have emerged in recent years 1 2 where the world model paper also proposed to build a graph to guide the subgoal selection the authors are encouraged to compare the proposed method to more advanced goalconditioned hrl methods 1 zhang er al world model as a graph learning latent landmarks for planning icml 2021 2 li et al learning subgoal representations with slow dynamics iclr 2021 the authors have discussed the limited application to environments with discrete environment variables in section 7 and have not touched on another limitation of requiring oracle environment variables those two limitations are related docsepthe authors propose a formulation of hierarchical reinforcement learning that utilizes causal structure between environmental variables to generate subgoals the discovered subgoals and lowerlevel policies generated to reach the subgoals are subsequently adapted using a learnt hierarchical agent that solves downstream tasks the authors test their method in addition to several hierarchical rl baselines on the 2dminecraft and eden environments and obtain promising results on the both they show faster convergence in both environments in 
addition to interpretable causal graphs strengths 1 the problem considered by the authors is one of strong relevance 2 the approach is principled with wellmade figures 3 the results are repeated across seeds and are in challenging environments weaknesses 1 directionality its unclear how the algorithm learns directionality for causes and effects from my understanding the interventional data is generated by lowerlevel policies in the action space that attempt to set the environmental variables to a particular value thus it appears the order in which environment variables are selected matters what if a variable is selected in the initial iterations for which the causal parents are discovered in a later iteration is the causal graph updated in such a setting for example if the agent decides to learn policies that intervene on ironore before it intervenes on stonepickax what happens then 2 commentary on scalability of during large number of evs i believe that the authors need to add some information about the scalability of their approach with environmental variables in the current draft they say to verify whether newly associated variables are controllable as supposed we compare the final training success rates of the new subgoals with a preset threshold phi before adding them to the subgoal hierarchy how are they selected how many are selected how are the lowerlevel policies learned i would imagine through equation 4 but how long do you train etc should be added 3 the algorithm should not be relegated to supplementary work this draft sorely misses the details afforded by it and would help ground a lot of the description 4 some assumptions on the density of the underlying causal graph i believe the authors need to state some assumptions about the density of the underlying causal graph in particular that currently they assume that the underlying causal graph is dense and that each of the environment variables influences the others i can imagine a situation where mechanisms are sparse and that learning policies which intervene on a causal factor only allows one to intervene on the immediate children of the same causal factor but these children no longer have any other effect variables linked to them 5 description of subgoal hierarchy needed what is a subgoal hierarchy from reading the paper several times i was able to discern that is an ordering on the elements in the goal space that are linked due to their temporal dependencies ie getting io is a child of sp because sp is needed to mine io some mathematical precision is necessary to describe this especially since it forms the core of your algorithm compressing section 5 by avoiding newlines will generate room enough for such an explanation 6 assumptions on the utility of causal graphs for subgoal a central assumption made by the authors is that in their setup nodes on the causal graph are all useful in goal generation this is in general not true and worth stating for example manipulation by a robotic hand of an object in its palm does not require knowledge of friction coefficients between the block and the ground 7 curious that the oracle is outperformed by your method is this a bug should it not be that the oracle with groundtruth causal graph inputs beat discovery 8 missing citing work httpsarxivorgabs201003110 and httpsproceedingsneuripsccpaper2021filec1722a7941d61aad6e651a35b65a9c3epaperpdf and httpsieeexploreieeeorgdocument9561439 see the weaknesses section for my analysis of the assumptionslimitations of the work i dont believe there is 
any cause for concern for negative societal impact
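As a concrete reading of what these reviews call a subgoal hierarchy, below is a minimal sketch (hypothetical variable names such as wood and stone_pickaxe; not the authors' code) of how a discovered causal graph over discrete environment variables can induce an ordering of subgoals: a variable is proposed as a new subgoal once all of its causal parents are already controllable, mirroring the iterative boosting loop the reviews describe.

```python
# Sketch of deriving a subgoal training order from a discovered causal graph over
# discrete environment variables (hypothetical names; not the authors' code).
from collections import defaultdict

causal_edges = [("wood", "stone_pickaxe"),        # cause -> effect
                ("stone_pickaxe", "iron_ore"),
                ("iron_ore", "iron_pickaxe")]

parents = defaultdict(set)
variables = set()
for cause, effect in causal_edges:
    parents[effect].add(cause)
    variables.update((cause, effect))

controllable = set()      # variables whose goal-reaching policy already succeeds
training_order = []
while controllable != variables:
    frontier = sorted(v for v in variables - controllable
                      if parents[v] <= controllable)   # all causal parents reachable
    if not frontier:      # the rest depend on causes that were never discovered
        break
    training_order.append(frontier)
    controllable.update(frontier)   # here the real method would train and test them

print(training_order)
# [['wood'], ['stone_pickaxe'], ['iron_ore'], ['iron_pickaxe']]
```

In the actual method, the step marked "train and test them" would correspond to training low-level goal-reaching policies and checking their success rates against the preset threshold phi that the reviews mention, rather than simply marking the variables controllable.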
### Summary: | after a strong rebuttal from the authors and an extensive discussion among the reviewers i believe the papers pros outweigh its cons and this paper will be a valuable contribution to neurips i recommend it for acceptance and encourage the authors to address the reviewers comments for the cameraready version of the paper specifically please add the posted clarifications and experiments from the rebuttal and justify the limitation of available environment variables | [
30003, 310, 1677, 2278, 273, … (input_ids: token-ID array, truncated)
3302,
25142,
323,
534,
253,
19349,
4651,
403,
6888,
275,
247,
1996,
19502,
310,
253,
19349,
4216,
9300,
275,
824,
247,
4758,
323,
1650,
604,
253,
5570,
21936,
281,
3037,
7823,
326,
32014,
327,
6871,
410,
1078,
352,
16070,
265,
327,
8805,
29397,
991,
752,
6569,
840,
374,
22378,
327,
9171,
1430,
273,
1309,
1781,
1180,
273,
612,
84,
891,
2868,
326,
253,
4477,
878,
281,
823,
690,
1491,
670,
253,
9171,
1430,
273,
616,
2746,
342,
6938,
4903,
275,
253,
1655,
7482,
597,
1333,
50276,
936,
12654,
1880,
9841,
2330,
4903,
403,
3661,
494,
347,
6326,
359,
7277,
253,
2457,
3733,
2323,
4142,
273,
253,
747,
749,
2184,
932,
342,
247,
838,
292,
7887,
815,
74,
1078,
6240,
731,
281,
253,
749,
41881,
19868,
849,
403,
597,
4236,
849,
1142,
403,
4236,
849,
403,
253,
2406,
5251,
7823,
6311,
891,
651,
8564,
949,
5150,
577,
533,
849,
1048,
513,
368,
6194,
3966,
943,
320,
2879,
50276,
20,
253,
5933,
943,
417,
320,
50217,
281,
24864,
789,
436,
7482,
25132,
314,
38771,
253,
4278,
26299,
407,
352,
285,
651,
1361,
3216,
247,
2257,
273,
253,
5740,
50276,
21,
690,
13260,
327,
253,
4038,
273,
253,
6944,
19349,
4216,
891,
2868,
253,
4477,
878,
281,
1375,
690,
13260,
670,
253,
4038,
273,
253,
6944,
19349,
4216,
275,
1798,
326,
4390,
597,
5467,
326,
253,
6944,
19349,
4216,
310,
14086,
285,
326,
1016,
273,
253,
3126,
4903,
16178,
253,
2571,
891,
476,
8564,
247,
4112,
835,
6297,
403,
23507,
285,
326,
4715,
7823,
534,
32014,
327,
247,
19349,
2803,
760,
4483,
581,
281,
32014,
327,
253,
8993,
2151,
273,
253,
1072,
19349,
2803,
533,
841,
2151,
642,
3356,
452,
667,
643,
1055,
4903,
7939,
281,
731,
50274,
22,
5740,
273,
749,
41881,
19868,
3058,
752,
310,
247,
749,
41881,
19868,
432,
4361,
253,
2929,
2067,
2069,
891,
369,
2104,
281,
26923,
326,
310,
271,
15824,
327,
253,
3603,
275,
253,
4736,
2317,
326,
403,
7939,
1955,
281,
616,
11935,
21011,
26332,
2970,
17908,
310,
247,
1429,
273,
653,
984,
653,
310,
3058,
281,
7477,
17908,
690,
15965,
12320,
310,
3309,
281,
6266,
436,
3340,
1580,
352,
4948,
253,
5161,
273,
634,
5933,
509,
13537,
2593,
608,
407,
17816,
747,
8737,
588,
6635,
2316,
2217,
323,
824,
271,
8813,
50276,
23,
13260,
327,
253,
11839,
273,
19349,
14580,
323,
749,
41881,
247,
4275,
9376,
1160,
407,
253,
4477,
310,
326,
275,
616,
9978,
7632,
327,
253,
19349,
4216,
403,
512,
4217,
275,
4736,
5978,
436,
310,
275,
2087,
417,
2032,
285,
4409,
14851,
323,
1650,
19763,
407,
247,
35121,
1133,
273,
271,
1789,
275,
697,
19992,
1057,
417,
2430,
3640,
273,
20636,
10303,
875,
253,
2972,
285,
253,
3216,
818,
14338,
326,
253,
42295,
310,
41731,
10574,
407,
634,
1332,
310,
436,
247,
7505,
943,
352,
417,
320,
326,
253,
42295,
342,
3216,
33024,
19349,
4216,
14800,
7171,
8900,
854,
5816,
19936,
789,
5987,
39962,
2061,
5375,
1252,
46670,
740,
285,
5987,
856,
22868,
32167,
2824,
550,
20790,
938,
1797,
3140,
68,
1166,
1423,
66,
2787,
3156,
69,
3832,
66,
324,
23,
70,
33297,
66,
1671,
67,
2082,
66,
26,
68,
20,
554,
6653,
9275,
285,
5987,
466,
70,
15083,
410,
466,
1796,
2061,
3306,
36176,
1047,
1867,
50276,
2887,
253,
32213,
2593,
323,
619,
1783,
273,
253,
13260,
17465,
569,
273,
253,
789,
891,
13414,
2868,
627,
310,
667,
2847,
323,
4468,
323,
4016,
38058,
3486,
50276,
187,
187,
4118,
18435,
27,
6438,
247,
2266,
30080,
22559,
432,
253,
4477,
285,
271,
9470,
5955,
2190,
253,
30628,
891,
2868,
253,
9380,
5847,
32180,
798,
697,
772,
285,
436,
2929,
588,
320,
247,
9865,
7680,
281,
5723,
2824,
891,
5583,
352,
323,
14924,
285,
11907,
253,
4477,
281,
2953,
253,
30628,
5701,
323,
253,
4049,
254,
609,
5102,
2715,
273,
253,
2929,
50276,
46458,
4496,
823,
253,
9269,
8254,
6787,
285,
4679,
432,
253,
30080,
22559,
285,
15249,
253,
12291,
273,
2130,
3126,
4903,
209
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
29328,
271,
5570,
326,
24772,
19349,
8900,
285,
24498,
35221,
4715,
288,
8435,
5678,
715,
46449,
17477,
288,
8435,
390,
22942,
6285,
77,
253,
3126,
4903,
534,
830,
253,
3720,
323,
253,
7632,
273,
253,
8350,
19349,
1566,
326,
253,
5570,
4648,
403,
8025,
281,
320,
2130,
949,
271,
42295,
3733,
6569,
275,
767,
8661,
806,
253,
19349,
2605,
310,
6888,
407,
3733,
253,
5570,
281,
3986,
271,
16122,
873,
273,
749,
2184,
932,
846,
534,
253,
5570,
310,
10166,
327,
253,
4588,
4836,
327,
253,
767,
12620,
326,
403,
6760,
253,
22942,
6285,
77,
1332,
41217,
5272,
19349,
14580,
285,
41731,
13015,
253,
767,
1666,
25379,
2067,
490,
77,
569,
921,
326,
1110,
1543,
403,
10237,
281,
6046,
275,
253,
2530,
3126,
4903,
285,
3710,
19502,
6864,
50276,
45563,
16248,
46449,
342,
288,
8435,
2789,
8783,
273,
3282,
2223,
288,
8435,
7274,
13414,
11120,
2953,
253,
46449,
533,
5467,
352,
29688,
7384,
253,
3126,
4903,
403,
2130,
5277,
253,
9376,
273,
247,
19349,
2605,
347,
271,
42115,
8492,
310,
347,
3626,
347,
4979,
398,
403,
323,
3448,
390,
2410,
47301,
323,
3888,
50276,
45563,
253,
5016,
875,
253,
749,
41881,
5140,
285,
253,
8900,
1232,
8069,
14757,
271,
17947,
10921,
594,
253,
4081,
5570,
417,
760,
33772,
670,
19349,
2605,
275,
253,
3126,
326,
352,
11403,
533,
671,
20096,
352,
11403,
1142,
4722,
7794,
273,
253,
3126,
50276,
45563,
253,
2929,
310,
973,
34218,
285,
3477,
281,
1239,
50276,
20881,
1255,
253,
11659,
273,
253,
3126,
4903,
310,
8025,
253,
4477,
3748,
247,
1643,
3082,
11205,
387,
30375,
1110,
533,
642,
4679,
403,
2908,
835,
6888,
3126,
4903,
403,
1146,
908,
50276,
20881,
1255,
253,
1543,
403,
247,
2372,
3710,
767,
8892,
767,
1666,
25379,
597,
1056,
253,
1127,
326,
275,
253,
987,
5989,
436,
1332,
310,
3576,
533,
597,
13414,
1663,
2953,
253,
1953,
534,
5989,
403,
253,
987,
4394,
50276,
20881,
1255,
253,
2929,
310,
275,
878,
273,
247,
3790,
273,
3448,
14835,
50276,
783,
9376,
326,
253,
612,
84,
403,
2130,
949,
271,
42295,
310,
14155,
253,
1543,
3559,
403,
323,
2219,
835,
597,
403,
2130,
30375,
731,
275,
271,
440,
35421,
1039,
476,
320,
20276,
50276,
783,
2625,
323,
253,
31471,
273,
7431,
612,
84,
588,
452,
281,
1705,
432,
247,
5019,
273,
4836,
10921,
285,
46449,
8900,
2323,
7613,
30375,
612,
84,
310,
2779,
281,
2430,
5016,
342,
253,
46449,
8900,
1232,
534,
651,
2794,
247,
5412,
5478,
1895,
253,
1953,
273,
612,
8900,
310,
417,
9713,
275,
667,
273,
253,
4679,
326,
310,
2649,
247,
7654,
15504,
323,
436,
2929,
533,
352,
1057,
2701,
253,
30437,
273,
253,
1332,
285,
15974,
352,
651,
1056,
253,
2929,
10046,
50276,
783,
3126,
4903,
403,
8025,
281,
320,
13358,
436,
9376,
2789,
253,
17947,
6927,
407,
14155,
253,
3576,
1180,
273,
3054,
326,
878,
281,
320,
2783,
275,
253,
19349,
4216,
969,
15974,
436,
4809,
34243,
342,
4679,
534,
651,
18289,
1379,
1659,
327,
643,
12620,
685,
253,
4394,
908,
651,
19148,
253,
30437,
5028,
273,
253,
4081,
1332,
50276,
783,
19349,
2605,
273,
253,
3126,
310,
8025,
281,
320,
1355,
2217,
275,
1180,
281,
12647,
247,
23395,
1414,
12178,
4216,
50276,
783,
3215,
26208,
3408,
6613,
1919,
642,
747,
46449,
310,
6888,
436,
9376,
31451,
3839,
2186,
285,
15974,
253,
1953,
273,
849,
1199,
281,
1718,
275,
46449,
8900,
651,
320,
9865,
323,
4685,
253,
2087,
50228,
273,
253,
1543,
50276,
7152,
339,
431,
248,
2929,
29328,
247,
46449,
8877,
24498,
35221,
4715,
7792,
281,
1973,
24498,
5289,
275,
271,
34560,
43124,
1039,
253,
2929,
2722,
5661,
1543,
275,
374,
17670,
460,
12517,
285,
1407,
257,
12620,
342,
23507,
23267,
50276,
296,
3755,
20556,
50276,
783,
2929,
310,
973,
3542,
285,
2590,
253,
4679,
403,
973,
2218,
50275,
20881,
1255,
50276,
783,
5075,
383,
629,
273,
253,
2929,
310,
326,
352,
310,
417,
2590,
835,
253,
38135,
273,
253,
2929,
8696,
352,
651,
320,
10260,
14109,
604,
253,
4477,
22629,
616,
789,
1411,
2720,
789,
50275,
34263,
2045,
789,
14177,
281,
1089,
4216,
751,
5289,
275,
12620,
323,
4560,
6936,
10121,
3966,
323,
1650,
15688,
4715,
432,
20028,
407,
26736,
10861,
7139,
50275,
18108,
2720,
789,
327,
288,
8435,
285,
7477,
12517,
50276,
18,
7740,
1020,
2793,
44864,
275,
24498,
35221,
4715,
432,
32367,
374,
8495,
42111,
491,
4715,
432,
1643,
32367,
407,
10921,
44461,
253,
13260,
326,
3126,
4903,
403,
1677,
281,
441,
310,
253,
2022,
12291,
273,
253,
2929,
4560,
841,
3126,
4903,
2112,
342,
253,
2605,
310,
2834,
50276,
7152,
33032,
2520,
2929,
29328,
247,
747,
24498,
35221,
4715,
1332,
326,
812,
8356,
9413,
24498,
2605,
342,
247,
6311,
46449,
4216,
281,
1973,
253,
46449,
4216,
253,
4081,
1332,
23970,
253,
3126,
4903,
347,
7632,
275,
326,
4216,
534,
4419,
690,
8892,
29765,
3640,
253,
46449,
4216,
285,
749,
41881,
3169,
24498,
3646,
403,
10486,
6311,
285,
12454,
1016,
643,
50274,
9188,
40348,
12077,
24498,
2605,
8900,
310,
247,
7936,
2561,
1895,
275,
3676,
35221,
4715,
436,
2929,
3400,
247,
747,
1039,
281,
8415,
436,
1895,
534,
42261,
13406,
8453,
50274,
19164,
414,
253,
2934,
273,
970,
247,
46449,
4216,
281,
9413,
24498,
2605,
310,
4460,
2299,
253,
4081,
1332,
15771,
327,
253,
3126,
4903,
253,
1039,
273,
13546,
253,
3126,
4903,
310,
37825,
594,
891,
5545,
1880,
436,
1332,
812,
320,
3732,
281,
625,
2087,
7533,
835,
253,
42295,
3126,
4903,
403,
10693,
12482,
50274,
498,
15752,
253,
4028,
273,
436,
2929,
310,
6571,
2590,
533,
3133,
275,
247,
16949,
323,
1650,
627,
310,
247,
44127,
18122,
2228,
275,
1386,
40873,
253,
4477,
403,
14659,
281,
40167,
598,
253,
4028,
275,
253,
2852,
50275,
15177,
50276,
783,
2934,
273,
970,
247,
46449,
4216,
281,
12454,
253,
8900,
273,
24498,
2605,
2789,
3282,
1580,
627,
403,
46449,
7688,
2190,
8482,
6579,
275,
954,
24498,
8892,
253,
4081,
1332,
310,
2429,
1411,
2045,
4736,
44321,
288,
8435,
2987,
2299,
15081,
288,
317,
347,
247,
1375,
23037,
14387,
1332,
310,
417,
10799,
247,
2257,
273,
625,
7269,
4736,
44321,
288,
8435,
2987,
452,
13082,
275,
3332,
1107,
337,
374,
835,
253,
1533,
1566,
2929,
671,
4081,
281,
1973,
247,
4216,
281,
7102,
253,
749,
41881,
5438,
253,
4477,
403,
14659,
281,
7277,
253,
4081,
1332,
281,
625,
7269,
4736,
44321,
288,
8435,
3082,
50276,
18,
1182,
12109,
2827,
355,
1533,
1566,
347,
247,
4216,
4715,
21624,
39719,
323,
7219,
17857,
1686,
43425,
374,
632,
1162,
355,
4715,
749,
41881,
14237,
342,
3468,
8062,
17857,
32888,
43425,
50275,
783,
4477,
452,
5469,
253,
3710,
2898,
281,
12620,
342,
13358,
3126,
4903,
275,
2593,
818,
285,
452,
417,
14435,
327,
1529,
12291,
273,
10568,
42295,
3126,
4903,
1110,
767,
7364,
403,
2905,
5474,
339,
431,
248,
4477,
12661,
247,
15895,
273,
24498,
35221,
4715,
326,
29820,
19349,
2605,
875,
6938,
4903,
281,
6635,
749,
2184,
932,
253,
6888,
749,
2184,
932,
285,
2406,
5251,
7823,
4561,
281,
3986,
253,
749,
2184,
932,
403,
9674,
12956,
970,
247,
34003,
24498,
5570,
326,
35910,
15450,
8892,
253,
4477,
1071,
616,
1332,
275,
1635,
281,
2067,
24498,
391,
77,
1666,
25379,
327,
253,
374,
17670,
460,
12517,
285,
1407,
257,
12620,
285,
4044,
12532,
1543,
327,
253,
1097,
597,
921,
7938,
14940,
275,
1097,
12620,
275,
1635,
281,
4665,
494,
19349,
14580,
50275,
296,
3755,
20556,
337,
253,
1895,
2783,
407,
253,
4477,
310,
581,
273,
2266,
17200,
50276,
19,
253,
2746,
310,
3505,
74,
6216,
342,
973,
12710,
8442,
50276,
20,
253,
1543,
403,
6015,
2439,
12922,
285,
403,
275,
11132,
12620,
50275,
20881,
1255,
265,
337,
3884,
1319,
697,
12744,
849,
253,
5933,
33772,
3884,
1319,
323,
5997,
285,
2538,
432,
619,
4685,
253,
7268,
267,
941,
310,
4561,
407,
2406,
5251,
7823,
275,
253,
2250,
2317,
326,
3177,
281,
873,
253,
6938,
4903,
281,
247,
1798,
1318,
3021,
352,
4620,
253,
1340,
275,
534,
3126,
4903,
403,
4236,
8213,
752,
604,
247,
4778,
310,
4236,
275,
253,
3302,
25142,
323,
534,
253,
19349,
4651,
403,
6888,
275,
247,
1996,
19502,
310,
253,
19349,
4216,
9300,
275,
824,
247,
4758,
323,
1650,
604,
253,
5570,
21936,
281,
3037,
7823,
326,
32014,
327,
6871,
410,
1078,
352,
16070,
265,
327,
8805,
29397,
991,
752,
6569,
840,
374,
22378,
327,
9171,
1430,
273,
1309,
1781,
1180,
273,
612,
84,
891,
2868,
326,
253,
4477,
878,
281,
823,
690,
1491,
670,
253,
9171,
1430,
273,
616,
2746,
342,
6938,
4903,
275,
253,
1655,
7482,
597,
1333,
50276,
936,
12654,
1880,
9841,
2330,
4903,
403,
3661,
494,
347,
6326,
359,
7277,
253,
2457,
3733,
2323,
4142,
273,
253,
747,
749,
2184,
932,
342,
247,
838,
292,
7887,
815,
74,
1078,
6240,
731,
281,
253,
749,
41881,
19868,
849,
403,
597,
4236,
849,
1142,
403,
4236,
849,
403,
253,
2406,
5251,
7823,
6311,
891,
651,
8564,
949,
5150,
577,
533,
849,
1048,
513,
368,
6194,
3966,
943,
320,
2879,
50276,
20,
253,
5933,
943,
417,
320,
50217,
281,
24864,
789,
436,
7482,
25132,
314,
38771,
253,
4278,
26299,
407,
352,
285,
651,
1361,
3216,
247,
2257,
273,
253,
5740,
50276,
21,
690,
13260,
327,
253,
4038,
273,
253,
6944,
19349,
4216,
891,
2868,
253,
4477,
878,
281,
1375,
690,
13260,
670,
253,
4038,
273,
253,
6944,
19349,
4216,
275,
1798,
326,
4390,
597,
5467,
326,
253,
6944,
19349,
4216,
310,
14086,
285,
326,
1016,
273,
253,
3126,
4903,
16178,
253,
2571,
891,
476,
8564,
247,
4112,
835,
6297,
403,
23507,
285,
326,
4715,
7823,
534,
32014,
327,
247,
19349,
2803,
760,
4483,
581,
281,
32014,
327,
253,
8993,
2151,
273,
253,
1072,
19349,
2803,
533,
841,
2151,
642,
3356,
452,
667,
643,
1055,
4903,
7939,
281,
731,
50274,
22,
5740,
273,
749,
41881,
19868,
3058,
752,
310,
247,
749,
41881,
19868,
432,
4361,
253,
2929,
2067,
2069,
891,
369,
2104,
281,
26923,
326,
310,
271,
15824,
327,
253,
3603,
275,
253,
4736,
2317,
326,
403,
7939,
1955,
281,
616,
11935,
21011,
26332,
2970,
17908,
310,
247,
1429,
273,
653,
984,
653,
310,
3058,
281,
7477,
17908,
690,
15965,
12320,
310,
3309,
281,
6266,
436,
3340,
1580,
352,
4948,
253,
5161,
273,
634,
5933,
509,
13537,
2593,
608,
407,
17816,
747,
8737,
588,
6635,
2316,
2217,
323,
824,
271,
8813,
50276,
23,
13260,
327,
253,
11839,
273,
19349,
14580,
323,
749,
41881,
247,
4275,
9376,
1160,
407,
253,
4477,
310,
326,
275,
616,
9978,
7632,
327,
253,
19349,
4216,
403,
512,
4217,
275,
4736,
5978,
436,
310,
275,
2087,
417,
2032,
285,
4409,
14851,
323,
1650,
19763,
407,
247,
35121,
1133,
273,
271,
1789,
275,
697,
19992,
1057,
417,
2430,
3640,
273,
20636,
10303,
875,
253,
2972,
285,
253,
3216,
818,
14338,
326,
253,
42295,
310,
41731,
10574,
407,
634,
1332,
310,
436,
247,
7505,
943,
352,
417,
320,
326,
253,
42295,
342,
3216,
33024,
19349,
4216,
14800,
7171,
8900,
854,
5816,
19936,
789,
5987,
39962,
2061,
5375,
1252,
46670,
740,
285,
5987,
856,
22868,
32167,
2824,
550,
20790,
938,
1797,
3140,
68,
1166,
1423,
66,
2787,
3156,
69,
3832,
66,
324,
23,
70,
33297,
66,
1671,
67,
2082,
66,
26,
68,
20,
554,
6653,
9275,
285,
5987,
466,
70,
15083,
410,
466,
1796,
2061,
3306,
36176,
1047,
1867,
50276,
2887,
253,
32213,
2593,
323,
619,
1783,
273,
253,
13260,
17465,
569,
273,
253,
789,
891,
13414,
2868,
627,
310,
667,
2847,
323,
4468,
323,
4016,
38058,
3486,
50276,
187,
187,
4118,
18435,
27,
6438,
247,
2266,
30080,
22559,
432,
253,
4477,
285,
271,
9470,
5955,
2190,
253,
30628,
891,
2868,
253,
9380,
5847,
32180,
798,
697,
772,
285,
436,
2929,
588,
320,
247,
9865,
7680,
281,
5723,
2824,
891,
5583,
352,
323,
14924,
285,
11907,
253,
4477,
281,
2953,
253,
30628,
5701,
323,
253,
4049,
254,
609,
5102,
2715,
273,
253,
2929,
50276,
46458,
4496,
823,
253,
9269,
8254,
6787,
285,
4679,
432,
253,
30080,
22559,
285,
15249,
253,
12291,
273,
2130,
3126,
4903,
209
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a backdoor defense based on the observation that trojan and benign neurons exhibit different behavior in the victim model their defense assumes an attacker who can poison a subset of the training data and its labels they design a training algorithm that detects and suppresses neurons that form shortcuts between classes which they describe as linear decision boundaries empirical results are provided for image classification datasets showing that their training algorithm none has considerable improvements over existing defenses against three poisoningbased backdoor attacks finally the authors show an ablation study over different trigger sizes and poisoning rates which show that none is robust in most cases

strengths
novelty defenses against poisoningbased backdoor attacks are highly relevant detecting and suppressing malicious neurons during training is a novel idea that shows promising results
high effectiveness it appears that none is a promising defense that clearly outperforms many existing defenses the drop in clean accuracy is also often low 3 making it a practically relevant defense
ablation studies the defense was evaluated on multiple datasets against three attacks showing promising results everywhere the authors also show ablation studies over different trigger sizes and poisoning rates
opensource code

weaknesses
overall the paper was an interesting read and i found only minor weaknesses in the papers presentation the results and intuition are convincing and i believe they are of interest to the research community
formalization many symbols in algorithm 1 are only defined informally in the text but it would improve clarity if they were formalized eg the separate norm and init function a table explaining the symbols would help readability a lot
minor please break a single long paragraph into multiple smaller paragraphs to enhance readability some citations are missing eg for all datasets such as trojai 1

1 karra kiran chace ashcraft and neil fendley the trojai software framework an opensource tool for embedding trojans into deep learning models arxiv preprint arxiv200307233 2020

this paper demonstrated the backdoorrelated neurons in the dnn models form a hyperplane surface as the decision region to the backdoor target label based on this understanding the authors proposed a robust training algorithm none nonlinearity that identifies linear decision regions filters out inputs that are potentially poisoned and resets affected neurons to enforce nonlinear decision regions during training

the paper is well written and easy to understand the linearlyseparable assumption as an underlying explanation for the effectiveness of backdoor attacks is interesting however this work in its current form still lacks sufficient analysis to support this conclusion

this paper proves that a backdoored dnn learns a hyperplane as the decision region under some assumptions the authors argue that the backdoored dnns will increase linearity by introducing a large percentage of neurons activating on one piece of the activation function based on these understandings the authors propose a new trainingstage defense ie none that detects and removes potential poisoned samples and repairs compromised neurons the proposed defense is evaluated on mnist gtsrb cifar10 imagenet10 and trojai datasets

pros
1 the topic is of sufficient significance and interest to neurips audiences
2 the codes are provided which should be encouraged
3 the authors try to provide theoretical support for their method which should be encouraged
4 the authors discuss the resistance to potential adaptive attacks which should be encouraged
5 in general the idea is easy to follow although many important details are missing

cons
1 some statements need more details and justifications an adaptive attack with slow poisoning can bypass such defenses line 56 page 1 what the slow poisoning means it seems that there is no support references or experiments for it these methods fail to defend against the natural trojan line 3839 page 2 no experiments about whether abl cannot remove natural trojan i only see the results of dpsgd
2 missing assumptions in theorem 33 i think theorem 33 holds if and only if the trojan is complete and accurate
3 please provide more details about two statements in the appendix line 591593 page 14
4 missing an important related work and baseline it seems that 1 is the sota trainingstage backdoor defense which is not mentioned and discussed in this paper
5 the organization of section 4 is very poor there is a very long paragraph in this section which significantly increases the reading difficulty besides many important details are missing for example how to do the separation line 13 algorithm 1 what are the technical details of fishers linear discriminant analysis and jenks natural breaks optimization algorithm
6 it seems that hidden trojan backdoor attack is a trainingstage attack which is not effective if training from the scratch as suggested in 2 how did the authors evaluate defenses under this attack
7 why did the authors compare their defense only with dpsgd under natural trojans i understand that dpsgd seems to be effective in defending against natural trojans however this is not the reason for not comparing with other defenses
8 i think it is necessary to go deeper into the internal mechanism of the proposed method specifically according to algorithm 1 the proposed method can detect poisoned or biased samples please provide the precision and recall of this stage so that the readers can understand how well this stage is

i will increase my scores if the authors can partly address my concerns limited but sufficient
### Summary: | this paper introduces a new algorithm to mitigate backdoors in neural network models the reviewers agreed this paper proposes an interesting defense that is well motivated carefully evaluated with many ablation studies and highly effective the weaknesses raised have also been mitigated in the rebuttal period and the paper is generally strong
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the main contribution of this paper is a sampleefficient rl approach for learning policies for longhorizon tasks by making use of multiple levels of abstraction each of which is associated with a set of option templates an option template defines an option in terms of initial and final states but not how to implement it all option templates are handcrafted at the start of training so the agent can execute an option template despite not knowing how to implement it yet during training the agent learns how to implement each option template in terms of the option templates at the next lower level of abstraction the paper evaluates this approach in three domains all in simulation craft a grid world based on minecraft stacking with a robot arm and 11 vs 11 football this topdown approach of learning options reaches higher performance than learning the options first and then combining them it is orders of magnitude more sampleefficient than learning without options

strengths
the proposed approach demonstrates orders of magnitude improvement in sampleefficiency compared to stateoftheart rl approaches that do not use options
the approach is evaluated in complex continuous control domains
includes an interesting comparison that shows this approach of learning topdown with option templates works better than learning bottomup ie learning the options first and then connecting them together the hypothesis is that this is because the learned options typically do not perform perfectly whereas option templates do which makes it easier to learn how to correctly connect options together when using the latter
clear explanation of the background in terms of options and semimdps

weaknesses
the approach requires defining the levels of abstraction and the set of option templates in each the effectiveness of the approach presumably depends greatly on how well the problem is decomposed into these levels of abstraction and option templates the implementation of option templates can be nontrivial as is acknowledged in the limitations section
the description of the method could be more clear including a more detailed figure 2 would help perhaps with more concrete examples of what the options could be eg from the minecraft task and showing how over the course of training each of the options is learned in terms of the next level of options also in the implementation of the approach how does one restrict the policy to pick from the set of options that are valid in a particular state
there is no comparison against option value iteration in the manipulation and football environments is it because option value iteration would take too long to learn for these tasks
no evaluation on real robots this would be a standout paper for corl if the evaluation included learning a longhorizon task with a real robot

the paper provides an interesting perspective to existing optionbased hierarchical rl methods by proposing the idea of option templates it introduces a novel framework to learn option templates which enables the agent to have expert intervention as highlevel option of the task when they are unsure how to continue the action from the state specifically it could transit from one state to another immediately where the transition represents the execution of an unimplemented skill primitive it lays out different levels of option templates and eventually learns the lower level actions in the environment this could improve the policy performance and sample efficiency of rl and the paper shows the effectiveness of this method on tasks including robot manipulation and soccer

strengths
the motivation for using option template from the idea of sticky mitten is interesting and strong it provides a nice analogy between human and agent learning
it is a very practical and interesting idea to have expert interventions as highlevel instructions which largely reduces the exploration burden and provide learning signals to the agent
the explanation on option template method and idea is very clear there are clear comparison between the option template
the limitation section provides true and tothepoint discussion on the sources of teleportation which should be considered in future work
there are extensive discussions on how this approach compared with prior methods like optionvalue iteration and curriculum learning

weaknesses
the main concern of the paper is how well it could be applied to more complex robotics tasks that are 1 have longerhorizon which motivates the use of hierarchical rl 2 where the skills are not clearly defined the robot experiments in the paper involves simple moving of objects and does not reflect how hierarchical rl could help with more complex tasks there are no real robot experiments and no discussions of how well this could be scaled to realworld setting where there are more intricate manipulation of objects and more complex skills
for more complex longhorizon robotics tasks how could the skills be extracted and the options to be formed in the first place does it assume that it already have skill segmentation done the learning levels here seems a bit contrived and manually designed for each individual task
while comparing the option template method to previous hierarchical rl method is sound the idea of option template have the use of expert intervention which relates to the line of work on learning with expert interventions while this is a different formation and a different framing there should be references made to these methods listed a few in below
1 efficient learning of safe driving policy via humanai copilot optimization li 2022 httpsarxivorgabs220210341
2 expert intervention learning spencer 2020 httpslinkspringercomarticle101007s10514021100069

the paper exploring with sticky mittens reinforcement learning with expert interventions via option templates introduces a hierarchical reinforcement learning approach that divides long timescales in multiple option templates ie a group of initial and terminal states without optionspecific lowlevel policies the lowlevel control policies then only come into play at the very end to successively implement the previously found option templates the approach is evaluated on multiple reinforcement learning benchmarks in simulation

strengths
in general the research topic of the submitted manuscript is of high practical relevance and hence of interest to the community of corl it is well motivated and the background is well presented

weaknesses
unfortunately the manuscript in its current form comes with a couple of issues
clarity the explanation of the approach is very confusing and not easy to grasp for external readers
technical novelty the general idea seems very similar to previous work eg 1 in addition to the lack of clarity the contributions to the field remain vague and seem minor
evaluation even though the shown results in the given benchmarks are somewhat convincing they dont really highlight the special ingredients of the proposed method

1 learning multilevel hierarchies with hindsight levy et al iclr 2019
### Summary: | the strength and weakness of the paper raised by reviewers are summarized as follows

strength the motivation and ideas are clearly presented the approach is evaluated on complex continuous control domains such as fetchandstack and gfootball tasks and the benefits of the proposed approach is appropriately demonstrated empirical comparison between the bottomup and topdown approaches is interesting

weakness the proposed approach requires the option templates which is not trivial to extract it is not clear how to extract the option templates for tasks with longer horizons more indepth discussions on the related work are necessary

in addition to the weakness raised by reviewers ae raises the following concerns the comparison with impala dqn and the method in a nair et al 1 in fetchandstack and gfootball tasks does not look fair the task is significantly simplified for the proposed method using the predefined option templates but it seems that the option templates are not given to baseline methods it seems that there is a gap between the motivation in the introduction and what is done in the proposed method from the introduction ae thought that all the layers would be learned simultaneously in the proposed method however in the proposed method the lowest level policy is manually designed and given to the agent as option templates

overall the paper contains interesting ideas and results however it is necessary to improve the presentation in some parts

postrebuttal comments while some concerns were raised by reviewers the authors addressed them by deepening the discussion and providing additional results considering that reviewers agree that the paper presents interesting ideas and results ae recommend the acceptance of the paper ae also encourages the authors to check the comments from the reviewers once again and make some more efforts to improve the presentation
30003, 310, 1677, 2278, 273, 247, ... ]   (input_ids)
[ 1, 1, 1, ... , 1 ]   (attention_mask)
[ 30003, 310, 1677, 2278, 273, 247, ... ]   (labels)
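Each record above pairs the plain-text Input and Output with tokenized fields: in the rows shown, attention_mask is all ones and labels appear to repeat input_ids. A minimal sketch of how such fields are typically produced is given below; the actual tokenizer and concatenation format behind this dump are not stated, so the GPT-2 tokenizer and the newline joins here are assumptions.

```python
from transformers import AutoTokenizer

# Assumption: a GPT-2-style BPE tokenizer; the dump does not say which tokenizer produced these ids.
tokenizer = AutoTokenizer.from_pretrained("gpt2")


def build_row(instruction: str, review: str, summary: str) -> dict:
    # Input column: instruction + review, ending with the "### Summary:" cue; Output column: the target summary.
    prompt = f"{instruction}\n### Review:\n{review}\n### Summary:"
    enc = tokenizer(prompt + " " + summary)
    return {
        "Input": prompt,
        "Output": summary,
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when no padding is applied
        "labels": list(enc["input_ids"]),         # causal-LM style: labels mirror input_ids
    }
```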
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the manuscript focuses on a trending topic of applying a bidirectional encoder representations from transformers bertbased prediction model to a new domain more precisely it addresses classifying electronic recruitment records errs with respect to job skills its contributions include but are not limited to i releasing a related deidentified err dataset to the public domain ii introducing a bertbased embedding model called skillbert to group skills present in this err dataset into as competency clusters and iii giving experimental evidence of the obtained modelling gains however i am not convinced that these experiments are sufficient to support accepting the manuscript for the following five main reasons first the compared models and methods are somewhat old and elementary eg word2vec and tfxidf thereby failing to capture the current advances second in my opinion the data annotation process calls for clarifications for example what was the expertise of the expert annotators how were research ethics and informed consenting assured in this process involving human annotators as study participants and do the authors obtain the rights to release the dataset andor annotated dataset third the overall experimental setting does not seem to be adequately captured and justified and i am unable to find a description of the performed statistical significance testing fourth the manuscript demonstrates only a reasonable understanding of related work in applications to skillcompetency demand and existing studies in relevant computational methodsmodelsdata see eg httpswwwaclweborganthologysearchq22job22 for recent relevant papers from the acl anthology that are largely missing from the reference list fifth the manuscript should be edited more carefully by clarifying both its contributions and limitations in relation to related work describing and justifying its methodology and experiments moving the intext citations from the abstract to the body text of the later sections and enhancing the image readability in conclusion the study is valuable but needs further workdocsepthis paper proposes a model for job application screening since there is no jobrelated dataset available the authors manually assigned labels to a large job application dataset a skill set eg apache hadoop apache pig html javascript is firstly extracted from the job dataset then a competency group is constructed eg big data frontend as the labels the problem is then formulated as a multilabel classification problem that is given a skill which may belong to multiple competency groups the model has to predict its competency groups the authors proposed to use bert as the main model moreover the authors use additional features like similaritybased and clusterbased features the experimental results are good we think it can help recruiters find a suitable applicant however this paper is straightforward using bert as a main model for text classification is a wellknown technique and many papers already applied berts in other domains like biomedicine and law so we think the technical contribution of this paper is limited furthermore some parts of this paper are not clearly explained for example the authors mentioned that they also use some features like frequencybased and groupbased features but did not find detailed descriptions of these two features one positive side of this paper is that the authors release a publicly available job application dataset establishing a dataset is timeconsuming and requires a lot of human efforts the dataset consists 
of 700000 job requisitions which is large enough it is good that the authors are willing to share this dataset to sum up this paper proposes a skill classification model for job screening which is useful but we think that the methodology and its technical contribution are not strong enough it might not be qualified as a regular paper for iclr skillbert skilling the bert to classify skillsdocsepthis paper studied the problem of classifying skills into competency groups the authors proposed skillbert a bertbased model to extract the embeddings of skills and use that for the classification task strengths 1 the authors presented the details of feature engineering and experiment design they also conducted extensive and comprehensive experiments which compare a lot of classificationfeature engineering methods it is a very good practical guide for doing related tasks 2 the authors released the code and dataset which largely improve the reproducibility of this work 3 the paper is well written and easy to follow weakness 1 this paper studied a very specific problem which might only be interested to a very small group of researchers it seems more like an industry track paper 2 if i am correct the problem size is relatively small the total number of skills is 2997 and each of them will be classified into one or more competency group out of 40 competency group do we really a bert model for this problem maybe the authors could provide some costperformance tradeoff analysis here questions for authors 1 in table 3 i only see the precisionrecallf1 score for class 0 and class 1 where is the number for class 2 did i misunderstand this experiment overall comments this is a good paper and i personally support it to be accepted however i do think that it seems more like an industry track paper this is not for me to decide maybe metareviewer could share some opinions about whether this paper is suitable to iclr
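The formulation described in the review above, a skill phrase mapped to one or more of roughly 40 competency groups on top of BERT embeddings, can be pictured with the sketch below; the checkpoint, classification head, and threshold are illustrative stand-ins rather than the paper's actual configuration, and training of the head is omitted.

```python
import torch
from transformers import AutoTokenizer, AutoModel

NUM_GROUPS = 40  # the reviews mention about 40 competency groups (e.g. "big data", "frontend")

tok = AutoTokenizer.from_pretrained("bert-base-uncased")        # stand-in encoder
encoder = AutoModel.from_pretrained("bert-base-uncased")
head = torch.nn.Linear(encoder.config.hidden_size, NUM_GROUPS)  # one logit per competency group


def competency_groups(skill: str, threshold: float = 0.5):
    enc = tok(skill, return_tensors="pt")
    with torch.no_grad():
        cls = encoder(**enc).last_hidden_state[:, 0]  # [CLS] embedding of the skill phrase
        probs = torch.sigmoid(head(cls))              # independent sigmoid per group -> multi-label
    return (probs[0] > threshold).nonzero().flatten().tolist()


# e.g. competency_groups("apache hadoop") should return the indices of groups such as
# "big data" once the head has actually been trained on the labelled skill data.
```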
### Summary: | the authors propose an approach for the task of categorizing competencies in terms of worker skillsets this is a potentially useful if somewhat niche task and one strength here is a resource to support further research on the topic however the contribution here is limited the methods considered are not new and while the problem has some practical importance it does not seem likely to be of particular interest to the broader iclr community | [
30003, 310, 1677, 2278, 273, 247, ... ]   (input_ids)
[ 1, 1, 1, ... , 1 ]   (attention_mask)
[ 30003, 310, 1677, 2278, 273, 247, ... ]   (labels)
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the work introduces memup a training method that allows to learn longterm dependencies without backpropagating gradients through the entire sequence unlike rnns the method is based on maximizing mutual information between memory states then it is explained how the equations need and can be approximated this introduces further assumptions leading to possible vulnerabilities to the noisytv problem this is further analyzed in the experimental sections this is further simplified to a practical optimization function related to crossentropy loss that depends on the memory m at a timestep and the model q given the respective input the work then continues by describing how memup is applied to rnns the experimental sections cover results for basic sequence tasks copy scattered copy permuted mnist and add task while comparing to lstms srnns and transformers also it is experimented in rl settings tmaze vizdoomtwocolors showing competitive results to existing methods the work seems original to be the best of this reviewer knowledge and relevant to the neurips community enabling models to learn and process longrange dependencies is indeed a problem of interest in particular how to train memory augmented models the works proposes an interesting method to train such models the idea behind the method is clear and interesting the paper focuses on deriving the method but could have make a better job exploring it further instead authors decided to compare against other methods in two settings supervised learning on sequences and rl see questions below in addition several of these experiments could have compared to newer recurrent methods and memory augmented methods both transformers and rnns see 1 and 2 as examples the text is clear putting more emphasis on how the algorithm is applied to rnns and other networks could really help the reader understand the method and give it more clarity defining the tasks in the text instead of the supplementary would be appreciated as well 1 legendre memory units continuoustime representation in recurrent neural networks voelker et al 2019 2 not all memories are created equal learning to forget by expiring sukhbaatar et al 2021 the authors describe and analyze the noisytv problem based on their approximation they also assume that dependencies are scarce with the increase of the distance docsepin this paper the authors proposed a method called memup to help recurrent neural networks learn longterm dependencies better specifically the authors leverage a memory model to predict future outcomes with high uncertainty by skipping all states with lower uncertainty the training process takes less computational cost in backpropagation the framework involves a uncertainty detector model and a predictor to estimate mutual information used in memory model training the authors show experimental results on both supervised learning tasks and reinforcement learning tasks with improvement comparing with baselines the manuscript has the following strengths 1 the motivation is reasonable and the authors proposed a framework to address the motivation 2 the experiments are solid the authors provide comparible or better results in comparison with several baselines designated to handle long sequences 3 for such a complicated framework the authors are able to illustrate clearly the manuscript has the following weaknesses 1 the proposed framework involves training of several models including a memory model gtheta a predictor function qphi and a uncertainty detector model dpsi the whole model is 
too complicated in my point of view some typos 1 title for figure 2 final performanse should be final performance the authors have adequately addressed the limitation and potential negative social impact of their work docsepthe paper proposes memup memory for uncertainty prediction a method to learn longterm dependencies for recurrent models memup learns to keep useful past states for future use so it saves both computation and memory the memory module is trained to optimize the mutual information between past states and current prediction targets and only the steps that have highest uncertainty would be used for training to further save memory usage empirical experiments on supervisied learning and reinforcement learning tasks show that memup significantly boost the results over its baselines the solution proposed by the paper is valuable for longsequence modeling on the one hand it reduces the computational overhead of rnn model training on the other hand it could preserve distant information which rnns would easily forget the methods are simple and the intuition behind the methods all make sense to me the paper is wellwritten with details clearly formulated and illustrated together with its experiments on both supervised learning and reinforcement learning tasks i can foresee its potential values in the field of reinforcement learning and nlp however the paper does come with some drawbacks and some confusions 1 lack of background work though closed related to memory neural networks none of them is discussed and compared for example j weston s chopra and a bordes memory networks iclr 2014 similarly pointer networks should also be discussed eg see et al 2017 memory modules are also popular for longsequence processing in nlp eg coreference resolution toshniwal et al 2020 2 insufficient empirical evidence experiments in section 5 are claimed to be longsequence tasks while their sequences are not long enough the lengths of copy scattered copy and add are 1000 which can even be handled by some transformers eg t5 the truncation lengths are as small as 10 and up to 60 only which can be much larger for real applications 3 choice of uncertainty detector while the description of reducing the cost of memory training in section 2 seems intriguing the method used in training is simplified into a point estimate of prediction error in this case it makes more sense to me to simplify the justification in section 2 5 while memup uses information theory to help the model to keep the most useful information the basic idea of memup is very similar to lstm which purely relies on bptt to learn how to keep memories the authors show in sections 5 and 6 that memup outperforms lstm the baselines are presented in a negative light where the rollout lengths are less than 60 steps the authors may clearly compare the difference between memup and lstm as well as providing more evidences to show the advantage of memup i do not foresee any potential societal impacts that may occur
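The training signal the reviews describe, maximising mutual information between a remembered state and far-future prediction targets through a cross-entropy-like loss on a predictor q, is usually made tractable with a standard variational lower bound; the sketch below uses generic notation and need not match the paper's exact derivation.

```latex
I(m_t;\, y_{t+\Delta})
  \;=\; H(y_{t+\Delta}) \;-\; H(y_{t+\Delta} \mid m_t)
  \;\ge\; H(y_{t+\Delta}) \;+\; \mathbb{E}_{p(m_t,\, y_{t+\Delta})}\!\left[\log q_\phi(y_{t+\Delta} \mid m_t, x)\right]
```

Because the entropy term does not depend on the memory or predictor parameters, maximising this bound amounts to minimising the cross-entropy of q, which matches the practical objective the reviewers mention; evaluating it only at the steps flagged as most uncertain by the detector is what keeps the memory training cheap.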
### Summary: | the reviewers found the ideas presented in the paper interesting the use of mutual information to train memory for a model and the clear presentation some questions were raised about demonstrating on a more elaborate set up such as nlp tasks the main experiments aside from the toy experiments of copy etc algorithmic tasks seem to be on rl experiments but the method has been advertised more broadly in the motivation another reviewer raised the question of the complexity of training multiple networks nevertheless the reviewers found the paper interesting enough to recommend a weak accept and i support that recommendation from a reviewers lens i was a little surprised that the paper made no mention of prior works on maximizing mutual information between features of neural networks to improve results as an example see the following paper 1 that uses a mutual information regularizer between states at different steps of a recurrent neural networks there is also a rich literature of doing so for convolutional neural networks it would have made sense to compare how the idea in the paper performed in comparison to these methods and in a sense the ablation study which looked at randomly choosing time steps k regardless of the uncertainty estimator is an experiment in this direction i understand that part of the paper deals with the choice of time points to increase mutual information between and so its probably more efficient than the other alternatives but a comparison or discussion in related works would have made the paper stronger 1 better longrange dependencyby bootstrapping a mutual information regularizer httpsarxivorgpdf190511978v1pdf | [
30003, 310, 1677, 2278, 273, 247, ...   (input_ids)
629,
273,
253,
2929,
13330,
342,
253,
4327,
273,
673,
2792,
281,
2572,
15577,
1491,
875,
285,
594,
697,
3164,
625,
5919,
685,
253,
643,
18075,
533,
247,
5301,
390,
5955,
275,
2905,
2987,
651,
452,
1160,
253,
2929,
10046,
50276,
18,
1805,
1048,
6324,
18925,
1615,
7491,
10981,
2784,
247,
15577,
1491,
3963,
6081,
5987,
39962,
2061,
9275,
746,
1762,
12115,
3141,
87,
18,
9275
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
789,
23970,
1167,
484,
247,
3733,
1332,
326,
4483,
281,
3037,
1048,
3945,
21011,
1293,
896,
44263,
839,
27935,
949,
253,
2862,
3425,
12401,
391,
79,
2224,
253,
1332,
310,
1754,
327,
46875,
15577,
1491,
875,
3541,
3054,
840,
352,
310,
5544,
849,
253,
7424,
878,
285,
476,
320,
34930,
436,
23970,
2007,
13260,
4283,
281,
1896,
42220,
281,
253,
642,
261,
1767,
87,
1895,
436,
310,
2007,
5867,
275,
253,
5661,
7118,
436,
310,
2007,
21010,
281,
247,
8542,
13757,
1159,
2905,
281,
2831,
290,
10144,
2957,
326,
7024,
327,
253,
3541,
278,
387,
247,
4522,
383,
554,
285,
253,
1566,
2805,
1677,
253,
9056,
3280,
253,
789,
840,
7788,
407,
12930,
849,
1167,
484,
310,
3732,
281,
391,
79,
2224,
253,
5661,
7118,
3835,
1543,
323,
5044,
3425,
8892,
3491,
17485,
3491,
8143,
4525,
278,
79,
382,
285,
823,
4836,
1223,
10941,
281,
298,
296,
983,
256,
30930,
2224,
285,
4979,
398,
671,
352,
310,
3368,
264,
275,
391,
77,
7533,
246,
785,
2721,
40027,
3088,
297,
7553,
4368,
641,
4645,
12085,
1543,
281,
5368,
3082,
50275,
783,
789,
3133,
3236,
281,
320,
253,
1682,
273,
436,
37317,
3640,
285,
4623,
281,
253,
5723,
2824,
3114,
17690,
3210,
281,
3037,
285,
1232,
1048,
6324,
21011,
310,
6296,
247,
1895,
273,
1600,
275,
1798,
849,
281,
6194,
3541,
31612,
3210,
253,
2987,
29328,
271,
4722,
1332,
281,
6194,
824,
3210,
50275,
783,
2934,
3212,
253,
1332,
310,
2590,
285,
4722,
253,
2929,
16633,
327,
44190,
253,
1332,
533,
812,
452,
1056,
247,
1805,
2628,
18216,
352,
2007,
3185,
4477,
4425,
281,
7277,
1411,
643,
3082,
275,
767,
7533,
22296,
4715,
327,
6430,
285,
391,
77,
923,
3533,
2708,
275,
1635,
2067,
273,
841,
4679,
812,
452,
2429,
281,
21629,
18902,
3082,
285,
3541,
31612,
3082,
1097,
4979,
398,
285,
391,
79,
2224,
923,
337,
285,
374,
347,
6667,
50275,
783,
2505,
310,
2590,
8133,
625,
15075,
327,
849,
253,
5933,
310,
3732,
281,
391,
79,
2224,
285,
643,
6928,
812,
1663,
1361,
253,
9414,
2096,
253,
1332,
285,
1918,
352,
625,
19843,
13947,
253,
8892,
275,
253,
2505,
3185,
273,
253,
24864,
651,
320,
14109,
347,
973,
50275,
18,
13691,
250,
3541,
5085,
44351,
26202,
553,
6779,
275,
18902,
11454,
6928,
3273,
293,
6426,
1162,
355,
6247,
50276,
19,
417,
512,
12959,
403,
3562,
4503,
4715,
281,
7740,
407,
866,
4261,
402,
17616,
5830,
15642,
1162,
355,
43425,
253,
4477,
6266,
285,
12106,
253,
642,
261,
1767,
87,
1895,
1754,
327,
616,
11193,
597,
671,
5467,
326,
21011,
403,
29967,
342,
253,
2572,
273,
253,
4181,
50275,
7152,
339,
9852,
436,
2929,
253,
4477,
4081,
247,
1332,
1925,
1167,
484,
281,
1361,
18902,
11454,
6928,
3037,
1048,
3945,
21011,
1805,
5742,
253,
4477,
25057,
247,
3541,
1566,
281,
3283,
2852,
6973,
342,
1029,
11649,
407,
42654,
512,
3054,
342,
2406,
11649,
253,
3733,
1232,
3936,
1679,
15180,
2105,
275,
896,
44263,
318,
253,
7792,
8687,
247,
11649,
13562,
1566,
285,
247,
23403,
281,
6642,
15577,
1491,
908,
275,
3541,
1566,
3733,
253,
4477,
921,
5661,
1543,
327,
1097,
22296,
4715,
8892,
285,
35221,
4715,
8892,
342,
7756,
10941,
342,
1666,
25379,
253,
7714,
556,
253,
1563,
20544,
337,
253,
16038,
310,
5272,
285,
253,
4477,
4081,
247,
7792,
281,
2953,
253,
16038,
374,
253,
4679,
403,
4891,
253,
4477,
2085,
3294,
917,
390,
1805,
1543,
275,
5301,
342,
2067,
1666,
25379,
13205,
281,
6016,
1048,
6430,
495,
323,
824,
247,
9542,
7792,
253,
4477,
403,
2104,
281,
17093,
4518,
50276,
783,
7714,
556,
253,
1563,
32213,
337,
253,
4081,
7792,
8687,
3733,
273,
2067,
3210,
1690,
247,
3541,
1566,
305,
3124,
247,
23403,
1159,
2805,
2162,
285,
247,
11649,
13562,
1566,
277,
4144,
253,
2644,
1566,
310,
1512,
9542,
275,
619,
1127,
273,
1859,
50275,
8826,
963,
993,
337,
4060,
323,
4677,
374,
2457,
1347,
46679,
943,
320,
2457,
3045,
253,
4477,
452,
18212,
9713,
253,
12291,
285,
2442,
4016,
2675,
3486,
273,
616,
789,
5474,
339,
431,
248,
2929,
29328,
1167,
484,
3541,
323,
11649,
10554,
247,
1332,
281,
3037,
1048,
3945,
21011,
323,
18902,
3210,
1167,
484,
33772,
281,
1978,
4217,
2469,
3054,
323,
2852,
897,
594,
352,
26866,
1097,
13782,
285,
3541,
253,
3541,
6333,
310,
10166,
281,
22318,
253,
15577,
1491,
875,
2469,
3054,
285,
1655,
10554,
8571,
285,
760,
253,
5018,
326,
452,
4585,
11649,
651,
320,
908,
323,
3733,
281,
2007,
5321,
3541,
10393,
16774,
4679,
327,
2221,
4534,
728,
4715,
285,
35221,
4715,
8892,
921,
326,
1167,
484,
3012,
9510,
253,
1543,
689,
697,
1666,
25379,
253,
2900,
4081,
407,
253,
2929,
310,
9865,
323,
1048,
21934,
14053,
327,
253,
581,
1133,
352,
11355,
253,
15180,
18332,
273,
391,
9866,
1566,
3733,
327,
253,
643,
1133,
352,
812,
14003,
13392,
1491,
534,
391,
79,
2224,
651,
4354,
7740,
253,
3082,
403,
2969,
285,
253,
30328,
3212,
253,
3082,
512,
1056,
3282,
281,
479,
253,
2929,
310,
973,
15720,
342,
4278,
4518,
26115,
285,
12800,
2366,
342,
697,
4679,
327,
1097,
22296,
4715,
285,
35221,
4715,
8892,
891,
476,
32734,
697,
2442,
2193,
275,
253,
1673,
273,
35221,
4715,
285,
295,
24343,
50276,
35529,
253,
2929,
1057,
1705,
342,
690,
30453,
285,
690,
1461,
16723,
50276,
18,
50276,
77,
471,
273,
4114,
789,
2167,
4581,
2905,
281,
3541,
11454,
6928,
5293,
273,
731,
310,
5469,
285,
2429,
323,
1650,
480,
8935,
251,
256,
38419,
376,
285,
247,
270,
636,
265,
3541,
6928,
17857,
32888,
4059,
12014,
12219,
6928,
943,
671,
320,
5469,
24088,
923,
1162,
355,
4240,
3541,
11911,
403,
671,
4633,
323,
1048,
21934,
5162,
275,
295,
24343,
24088,
5161,
1793,
6064,
281,
1200,
8311,
18758,
1162,
355,
9169,
50272,
19,
50276,
968,
86,
2276,
16774,
1941,
4679,
275,
2593,
608,
403,
7558,
281,
320,
1048,
21934,
8892,
1223,
616,
6430,
403,
417,
1048,
2217,
253,
16095,
273,
3491,
17485,
3491,
285,
823,
403,
9098,
534,
476,
1014,
320,
15726,
407,
690,
4979,
398,
24088,
246,
22,
253,
47024,
16095,
403,
347,
1355,
347,
884,
285,
598,
281,
3925,
760,
534,
476,
320,
1199,
4067,
323,
1524,
4893,
50272,
20,
50276,
22122,
273,
11649,
13562,
1223,
253,
5740,
273,
8493,
253,
2105,
273,
3541,
3733,
275,
2593,
374,
3133,
27807,
253,
1332,
908,
275,
3733,
310,
21010,
715,
247,
1127,
6642,
273,
10554,
2228,
275,
436,
1083,
352,
2789,
625,
3282,
281,
479,
281,
25636,
253,
22861,
275,
2593,
374,
50276,
22,
50276,
6050,
1167,
484,
4648,
1491,
3762,
281,
1361,
253,
1566,
281,
1978,
253,
954,
4217,
1491,
253,
5044,
2934,
273,
1167,
484,
310,
1077,
2074,
281,
298,
296,
78,
534,
15846,
15771,
327,
270,
431,
85,
281,
3037,
849,
281,
1978,
12959,
253,
4477,
921,
275,
7118,
608,
285,
721,
326,
1167,
484,
41731,
13015,
298,
296,
78,
253,
1666,
25379,
403,
3559,
275,
247,
4016,
1708,
835,
253,
4533,
483,
16095,
403,
1679,
685,
3925,
5018,
253,
4477,
778,
4518,
7277,
253,
3064,
875,
1167,
484,
285,
298,
296,
78,
347,
973,
347,
5277,
625,
20456,
2979,
281,
921,
253,
5750,
273,
1167,
484,
2519,
891,
513,
417,
32734,
667,
2442,
38058,
16274,
326,
778,
2826,
2490,
187,
4118,
18435,
27,
783,
30628,
1119,
253,
5697,
3559,
275,
253,
2929,
4722,
50276,
783,
897,
273,
15577,
1491,
281,
6194,
3541,
323,
247,
1566,
285,
253,
2590,
9759,
690,
3533,
497,
5439,
670,
17227,
327,
247,
625,
21184,
873,
598,
824,
347,
295,
24343,
8892,
50276,
783,
2022,
4679,
9255,
432,
253,
20953,
4679,
273,
3491,
3966,
5933,
280,
8892,
1646,
281,
320,
327,
391,
77,
4679,
533,
253,
1332,
556,
644,
37636,
625,
21450,
275,
253,
16038,
1529,
37317,
5439,
253,
1953,
273,
253,
10454,
273,
3733,
2709,
6928,
17837,
253,
30628,
1119,
253,
2929,
4722,
2217,
281,
5583,
247,
5075,
2997,
285,
891,
1329,
326,
17401,
50276,
4064,
247,
30628,
9655,
891,
369,
247,
1652,
9861,
326,
253,
2929,
1160,
642,
3748,
273,
2720,
2987,
327,
46875,
15577,
1491,
875,
3386,
273,
11454,
6928,
281,
3157,
1543,
347,
271,
1650,
923,
253,
1563,
2929,
337,
326,
4648,
247,
15577,
1491,
3963,
6081,
875,
3054,
387,
1027,
5018,
273,
247,
18902,
11454,
6928,
627,
310,
671,
247,
6793,
6239,
273,
2509,
594,
323,
27311,
267,
11454,
6928,
352,
651,
452,
1160,
3282,
281,
7277,
849,
253,
2934,
275,
253,
2929,
2684,
275,
5301,
281,
841,
3082,
285,
275,
247,
3282,
253,
28913,
1263,
534,
3261,
387,
12421,
13887,
673,
5018,
465,
10159,
273,
253,
11649,
29107,
310,
271,
3368,
275,
436,
3884,
891,
2096,
326,
629,
273,
253,
2929,
13330,
342,
253,
4327,
273,
673,
2792,
281,
2572,
15577,
1491,
875,
285,
594,
697,
3164,
625,
5919,
685,
253,
643,
18075,
533,
247,
5301,
390,
5955,
275,
2905,
2987,
651,
452,
1160,
253,
2929,
10046,
50276,
18,
1805,
1048,
6324,
18925,
1615,
7491,
10981,
2784,
247,
15577,
1491,
3963,
6081,
5987,
39962,
2061,
9275,
746,
1762,
12115,
3141,
87,
18,
9275
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary proposes an importance sampling approach to sampling failure cases for rl algorithms the proposal distribution is based on a function learned via a neural network on failures that occur during agent training the method is compared to random sampling on two problems where the true failure probability can be approximated through random sampling the is method requires substantially fewer samples to produce failure cases and to estimate the failure probability review the overall approach is technically sound and the experiments demonstrate a significant savings in sampling compared to naive random sampling the specific novelty of the approach seems to be fitting the proposal distribution to failures observed during training i think the method accomplishes what it sets out to do however as the paper notes creating robust agents will require a combination of methodologies of which this testing approach is only a part i wonder if learning the proposal distribution based on failures observed during training presents a risk of narrowing the range of possible failures being considered of course identifying any failure is valuable but by biasing the search toward failures that are similar to failures observed in training might we be decreasing the likelihood of discovering failures that are substantially different from those seen during training one could imagine that if the agent has not explored some regions of the state space we would actually like to sample test examples from the unexplored states which becomes less likely if we preferentially sample in states that were encountered in training the paper is wellwritten with good coverage of related literature i would suggest incorporating some of the descriptions of the models and methods in appendix d into the main paper comments questions sec 42 how are the confidence bounds for the results calculated what are the true failure probabilities in the experiments sec 43 there is a reference to nonexistant appendix x pros overall approach is sound and achieves its objectives cons small amount of novelty primarily an application of established techniquesdocsepthis paper proposed an adversarial approach to identifying catastrophic failure cases in reinforcement learning it is a timely topic and may have practical significance the proposed approach is built on importance sampling for the failure search and function fitting for estimating the failure probabilities experiments on two simulated environments show significant gain of the proposed approaches over naive search the reviewer is not familiar with this domain but the baseline naive search seems like straightforward and very weak are there any other methods for the same problem in the literature the authors may consider to contrast to them in the experiments what is the certainty equivalence approach a reference would be helpful and improve the presentation quality of the paper what is exactly the thetat in section 33 what is the dimension of this vector in the experiments what quantities should be encoded in this vector in practice i am still concerned about the fact that the fpp depends on the generalization of the binary classification neural network although the authors tried to give intuitive examples and discussions nonetheless i understand the difficulty could the authors give some conditions under which the approach would fail any alternative approaches to the binary neural network what is a good principle to design the network architecture overall this paper addresses a practically 
significant problem and has proposed reasonable approaches while i still have concerns about the practical performance of the proposed methods this work along the right track in my opinion docseppaper summary the paper proposes a method for evaluating the failure probability of a learned agent which is important in safety critical domains using plain monte carlo for this evaluation can be too expensive since discovering a failure probability of epsilon requires on the order of 1epsilon samples therefore the authors propose an adversarial approach which focuses on scenarios which are difficult for the agent while still yielding unbiased estimates of failure probabilities the key idea of the proposed approach is to learn a failure probability predictor fpp this function attempts to predict at which initial states the system will fail this function is then used in an importance sampling scheme to sample the regions with higher failure probability more often which leads to higher statistical efficiency finding the fpp is itself a problem which is just as hard as the original problem of estimating the overall failure probability however the fpp can be trained using data from different agents not just the final agent to be evaluated for instance the data from agent training containing typically many failure cases the approach hinges on the assumption that these agents tend to fail in the same states as the final agent but with higher probability the paper shows that the proposed method finds failure cases orders of magnitude faster than standard mc in simulated driving as well as a simulated humanoid task since the proposed approach uses data acquired during the training of the agent it has more information at its disposal than standard mc however the paper shows that the proposed method is also orders of magnitudes more efficient than a naive approach using the failure cases during training review summary i believe that this paper addresses an important problem in a novel manner as far as i can tell and the experiments are quite convincing the main negative point is that i believe that the proposed method has some flaws which may actually decrease statistical efficiency in some cases please see details below detailed comments it seems to me that a weak point of the method is that it may also severly reduce the efficiency compared to a standard mc method if the function f underestimates the probability of failure at certain x it would take a very long time to correct itself because these points would hardly ever be evaluated it seems that the paper heuristically addresses this to some extent using the exponent alpha of the function however i think there should be a more indepth discussion of this issue an upperconfidencebound type of algorithm may be a principled way of addressing this problem the proposed method relies on the ability to initialize the system in any desired state however on a physical system where finding failure cases is particularly important this is usually not possible it would be interesting if the paper would discuss how the proposed approach would be used on such real systems on page 6 in the first paragraph the state is called s instead of x as before furthermore the arguments of f are switched
### Summary: | strengths the paper addresses a timely topic and reviewers generally agreed that the approach is reasonable and the experiments are convincing reviewers raised a number of specific concerns which could be addressed in a revised version or future work described below weaknesses some reviewers were concerned the baselines are weak several reviewers were concerned that relying on failures observed during training could create issues by narrowing the proposal distribution reviewer 3 characterizes this in a particularly precise manner in addition there was a general feeling that more steps are needed before the method can be used in practice but this could be said of most research recommendation all reviewers agreed that the paper should be accepted although there was also consensus that the paper would benefit from stronger baselines and more close attention to issues that could be caused by an overly narrow proposal distribution the authors should consider addressing or commenting on these issues in the final version | [
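The reviews in this row describe the core recipe at a level that a short sketch can make concrete: learn where failures are likely, sample initial states from a proposal concentrated there, and reweight so the failure-probability estimate stays unbiased. Below is a minimal sketch of that importance-sampling step; the toy failure region, the Gaussian distributions, and the hand-picked shifted proposal (standing in for the learned failure-probability predictor) are all illustrative assumptions rather than the paper's actual code.

```python
# Minimal sketch of the idea discussed in the reviews above: estimate a rare
# failure probability p = E_{x ~ P}[ fail(x) ] by importance sampling from a
# proposal Q concentrated where failures are expected.  Everything here
# (failure region, distributions, the shifted proposal) is an illustrative
# assumption, not the reviewed paper's code.
import numpy as np

rng = np.random.default_rng(0)

def fail(x):
    # Toy failure indicator: the "agent" fails only in a rare corner of state space.
    return float(x[0] > 2.8 and x[1] > 2.8)

# Nominal initial-state distribution P: standard normal in 2D.
def logpdf_p(x):
    return -0.5 * np.sum(x * x, axis=1) - np.log(2.0 * np.pi)

# Proposal Q: a Gaussian shifted toward the suspected failure region.  In the
# reviewed method this shift would come from a learned failure-probability
# predictor; here it is hard-coded for illustration.
shift = np.array([2.0, 2.0])

def sample_q(n):
    return rng.normal(loc=shift, size=(n, 2))

def logpdf_q(x):
    d = x - shift
    return -0.5 * np.sum(d * d, axis=1) - np.log(2.0 * np.pi)

n = 20000
xs = sample_q(n)
weights = np.exp(logpdf_p(xs) - logpdf_q(xs))      # dP/dQ importance weights
indicators = np.array([fail(x) for x in xs])
estimate = float(np.mean(weights * indicators))    # unbiased estimate of p
print(f"importance-sampling estimate of failure probability: {estimate:.2e}")
```

With a proposal centered near the failure region, a few thousand samples already produce failure cases, whereas sampling from P directly would need on the order of 1/p draws, which is the efficiency gap the reviews emphasize.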
[ 30003, 310, 1677, … ] (input_ids token-ID sequence for the row above, omitted) |
[ 1, 1, 1, … ] (attention_mask: all ones, omitted) |
[ 30003, 310, 1677, … ] (labels: token-ID sequence omitted) |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the main strength of this paper is the introduction of an algorithm that achieves stateoftheart performance for causal dag learning with few nodes described by a pnl model moreover the algorithm is able to run in a fraction of the time of its strongest competitor a rkhsbased score the ncd measure of conditional dependence requires setting up and training a neural network this means that a practitioner who might want to use ges with ncd is required to set many hyperparameters i might have missed it but i did not see a discussion on how one might choosefit those parameters i suggest moving the contents of appendix d2 to the main text when i was reading the results in the main text i did not know how to inperpret the f1score i suggest including a table of run times for the methods run in the various simulated scenarios the choice of hyperparameters for ncd is included in the appendix but how were these parameters chosen do you have any recommendation for how someone else using your method should choose these parameters i suggestion adding such a discussion to the paper docsepoverall the core content of the paper is well written and organized the authors observation that the consistency of ges is based on a local decision criterion that asymptotically tracks whether a conditional independence relation holds and can be replaced by any consistent measure of conditional independence is a significant and novel insight they relate conditional independencies to a tauconsistent test statistic and show asymptotic optimality is guaranteed for these test statistics this allows for the incorporation of various conditional dependence measures such as their novel ncd measure and this makes their algorithm more broadly applicable moreover they clearly discuss the limits and implications of their work compared to the rest of the paper the readability of the abstract and introduction can be improved for example it doesnt say anything about the local decision criterion which i think is one of the main insights of the paper it would be beneficial for the paper to state the significant insight and contributions clearly in the abstract the introduction could be improved see also my comments below for example the second paragraph on page 2 comes a bit out of the blue this paragraph is trying to make the point that parametric models have their problems but how does this relate to ges and to reframed ges the fact that the proofs of the main theoretical results are contained in the supplementary material makes this paper less selfcontained although this is the only slight weakness of the paper however i did not check all the proofs there to be honest minor comments p1 abstract justify to me it is not immediately clear why these results justify the approach p1 in order to reveal the causal reveal or discover p1 seen a host of nice results the word nice is a bit subjective p2 continuous program what program p2 some search algorithms is there a citation missing p2 born out of i dont know what the authors mean here p2 nonparametric ges is ges parametric p2 the way we consider in this work there may be something missing here p2 paradigmatic constraintbased what is a paradigmatic constraintbased method p4 def 1 do we assume that xisubsetneq pjg p4 def 2 what does the subscript n mean does i and ii hold for all n p4 one first defines a population quantity this sentence is difficult to understand because of the if and only if in the sentence p5 algorithm 1 what does valid mean p5 numerically stable why is this or a reference 
is missing docsepthe idea of using a tauconsistent statistics as a cimeasure guiding the ges algorithm is interesting the experimental results of the ges algorithm with the proposed ci measure are convincing i miss comparisons with methods that guarantee optimal solutions and that are based on other principles than greedy equivalence search as is for example the gobnilp method 1 i also miss more detailed comparisons of the computational time computational time is mentioned several times in the paper in a nonsystematic way i wonder how computationally demanding is the actual learning of the neural network based estimator and how it affects the overall computational complexity of the ges algorithm references 1 httpswwwcsyorkacukaigswgobnilp page 1 why the claim that causal structure learning has potentially wide applications is not supported by references to applications but by references to books on foundations of causal models and probabilistic graphical models in introduction several different methods for structural learning are listed however an important recent method 2 from the family of scorebased methods is not mentioned definition 1 symbol n is not defined page 5 section 4 symbols dx dy and dz are not defined in the definition of conditional independence of x and y given z the term pz on rhs or conditioning on z in the term on lhs is missing please unify references eg david maxwell chickering and max chickering is the same person and should be referred in the same way either initials or full first names should be used consistently this citation is awfully confused judea pearl et al models reasoning and inference cambridge uk cambridge university press 19 2000 references 2 m bartlett and j cussens integer linear programming for the bayesian network structure learning problem artificial intelligence 244258 271 2017 issn 00043702 httpsdoiorg101016jartint201503003 docsepthe subject matter taken up in this paper is very helpful as methods for analyzing the type of data used in the experimental section are still in need the specific models used for testing are wellselected and the analyses given for them is compelling the choice of algorithms for comparison makes sense also the method given here shows promise for analyzing some difficult data one thing given up by going to an independencebased ges it seems is the ability a scorebased ges has to sort through all scores at each step to find the edge additiondeletion that would yield the optimal such score so on the fact of it an independencebased ges is somewhat inferior to a scorebased ges perhaps this is not true for the implementation in this paper but the issue should probably be discussed somewhere if it was i missed it in addition timing results would have been very much appreciated also it would be helpful to clarify why a scorebased approach wasnt pursued in the first place or in addition the fges algorithm cited is not the original fges algorithm perhaps a reference to the original would be helpful in addition tuning the network to achieve a sparsity given by expert knowledge is less helpful than tuning say to minimize reversals of edges whose orientations are given by expert knowledge the writing is a bit hard to follow in places perhaps another editing pass would help
### Summary: | meta review the paper proposes a ges algorithm with a local measure of conditional dependence instead of a global score commonly used previously pros local instead of global is potentially impactful the major benefit is that they can handle complex nonlinear relationships the method is justified by theoretical results cons some unclear statements and references missing which the authors intend to correct the recommendation is an accept | [
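The reviews and meta-review above turn on one idea: each greedy edge decision can be made by a consistent conditional-dependence test rather than a global score difference. The sketch below illustrates that local decision on toy linear-Gaussian data; the partial-correlation test is only a stand-in for the paper's learned NCD measure, and the data-generating chain, sample size, and threshold are invented for illustration.

```python
# Toy illustration of the "local decision criterion" discussed above: an edge
# decision driven by a conditional-(in)dependence judgement.  A plain
# partial-correlation test on linear-Gaussian data stands in for the learned
# neural dependence measure; all numbers here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
# Chain X -> Z -> Y: X and Y are dependent marginally, but independent given Z.
x = rng.normal(size=n)
z = 0.8 * x + rng.normal(size=n)
y = 0.8 * z + rng.normal(size=n)
data = np.column_stack([x, y, z])  # columns: 0 = X, 1 = Y, 2 = Z

def partial_corr(data, i, j, cond):
    """Correlation of columns i and j after regressing out the columns in cond."""
    def residual(col):
        design = np.ones((len(data), 1))
        if cond:
            design = np.column_stack([data[:, cond], design])
        beta, *_ = np.linalg.lstsq(design, data[:, col], rcond=None)
        return data[:, col] - design @ beta
    ri, rj = residual(i), residual(j)
    return float(np.corrcoef(ri, rj)[0, 1])

threshold = 0.05  # crude dependence threshold for this toy example
print("corr(X, Y)     =", round(partial_corr(data, 0, 1, []), 3))   # far from 0
print("corr(X, Y | Z) =", round(partial_corr(data, 0, 1, [2]), 3))  # close to 0
print("keep direct edge X - Y given Z?",
      abs(partial_corr(data, 0, 1, [2])) > threshold)
```

On this chain the marginal correlation is clearly nonzero while the partial correlation given Z is near zero, so the local criterion correctly declines the direct X-Y edge, which is the behaviour the reformulated GES relies on.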
[ 30003, 310, 1677, … ] (input_ids token-ID sequence for the row above, omitted) |
[ 1, 1, 1, … ] (attention_mask: all ones; the list continues past the end of this excerpt)
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2022,
4757,
273,
436,
2929,
310,
253,
10199,
273,
271,
5933,
326,
33526,
1375,
23037,
14387,
3045,
323,
19349,
31398,
4715,
342,
1643,
7632,
2529,
407,
247,
268,
13307,
1566,
25761,
253,
5933,
310,
2104,
281,
1408,
275,
247,
6919,
273,
253,
673,
273,
697,
19508,
32048,
247,
391,
76,
11285,
3169,
4868,
253,
295,
2428,
2557,
273,
17697,
10096,
4419,
4758,
598,
285,
3733,
247,
11454,
2990,
436,
2097,
326,
247,
34815,
665,
1537,
971,
281,
897,
32624,
342,
295,
2428,
310,
2424,
281,
873,
1142,
4373,
22041,
891,
1537,
452,
9829,
352,
533,
891,
858,
417,
923,
247,
5955,
327,
849,
581,
1537,
5206,
8491,
1110,
3602,
891,
1804,
4886,
253,
9410,
273,
30762,
277,
19,
281,
253,
2022,
2505,
672,
891,
369,
4361,
253,
1543,
275,
253,
2022,
2505,
891,
858,
417,
871,
849,
281,
275,
468,
4025,
253,
269,
18,
18891,
50276,
74,
1804,
1690,
247,
2829,
273,
1408,
2069,
323,
253,
3082,
1408,
275,
253,
2710,
15524,
15216,
50276,
783,
4327,
273,
4373,
22041,
323,
295,
2428,
310,
2908,
275,
253,
30762,
533,
849,
497,
841,
3602,
6777,
513,
368,
452,
667,
17401,
323,
849,
3095,
2010,
970,
634,
1332,
943,
5206,
841,
3602,
891,
14876,
6240,
824,
247,
5955,
281,
253,
2929,
5474,
33032,
1189,
455,
253,
5161,
2600,
273,
253,
2929,
310,
973,
3542,
285,
10932,
253,
4477,
8310,
326,
253,
15274,
273,
32624,
310,
1754,
327,
247,
1980,
3061,
17705,
326,
38311,
11411,
1880,
247,
17697,
14275,
5886,
6556,
285,
476,
320,
7932,
407,
667,
5185,
2557,
273,
17697,
14275,
310,
247,
1534,
285,
4460,
12288,
597,
14588,
17697,
7365,
4601,
281,
247,
29201,
32474,
1071,
26312,
285,
921,
20185,
5556,
1319,
310,
16293,
323,
841,
1071,
9990,
436,
4483,
323,
253,
24319,
273,
2710,
17697,
10096,
5593,
824,
347,
616,
4460,
295,
2428,
2557,
285,
436,
2789,
616,
5933,
625,
21450,
7763,
25761,
597,
4518,
2319,
253,
7787,
285,
12739,
273,
616,
789,
50275,
3118,
1096,
281,
253,
1551,
273,
253,
2929,
253,
1239,
1430,
273,
253,
12002,
285,
10199,
476,
320,
5520,
323,
1650,
352,
36908,
1333,
2712,
670,
253,
1980,
3061,
17705,
534,
891,
1158,
310,
581,
273,
253,
2022,
16039,
273,
253,
2929,
352,
651,
320,
12912,
323,
253,
2929,
281,
1375,
253,
1534,
12288,
285,
9021,
4518,
275,
253,
12002,
253,
10199,
812,
320,
5520,
923,
671,
619,
5701,
2708,
323,
1650,
253,
1273,
12494,
327,
3239,
374,
3249,
247,
2372,
562,
273,
253,
4797,
436,
12494,
310,
2820,
281,
1056,
253,
1127,
326,
36833,
3210,
452,
616,
3237,
533,
849,
1057,
436,
14588,
281,
32624,
285,
281,
16110,
3163,
32624,
50274,
783,
958,
326,
253,
27947,
273,
253,
2022,
10527,
1543,
403,
6221,
275,
253,
24864,
2144,
2789,
436,
2929,
1679,
1881,
41010,
3738,
436,
310,
253,
760,
4512,
14855,
273,
253,
2929,
2299,
891,
858,
417,
2451,
512,
253,
27947,
627,
281,
320,
8274,
50275,
37585,
5701,
50276,
81,
18,
50276,
15834,
50276,
6309,
1419,
50275,
936,
479,
352,
310,
417,
4745,
2590,
2139,
841,
1543,
15249,
253,
2746,
50275,
81,
18,
50276,
249,
1340,
281,
10313,
253,
19349,
50275,
38198,
267,
390,
9413,
50276,
81,
18,
50276,
16564,
247,
3167,
273,
5322,
1543,
50276,
783,
3159,
5322,
310,
247,
2372,
17854,
50276,
81,
19,
50276,
38927,
2086,
50276,
5371,
2086,
50276,
81,
19,
50276,
8826,
3186,
11333,
50276,
261,
627,
247,
25577,
5816,
50276,
81,
19,
50276,
6448,
562,
273,
50275,
74,
13414,
871,
752,
253,
4477,
1599,
1060,
50275,
81,
19,
1327,
36928,
32624,
50276,
261,
32624,
36833,
50276,
81,
19,
253,
1039,
359,
1908,
275,
436,
789,
50275,
9088,
778,
320,
1633,
5816,
1060,
50276,
81,
19,
50276,
1148,
324,
15379,
1420,
7658,
3169,
50276,
5371,
310,
247,
22199,
1420,
7658,
3169,
1332,
50276,
81,
21,
50276,
1545,
337,
513,
359,
5467,
326,
1269,
261,
538,
1178,
9540,
268,
75,
72,
50276,
81,
21,
50276,
1545,
374,
752,
1057,
253,
749,
3866,
295,
1599,
1057,
891,
285,
21255,
2186,
323,
512,
295,
50275,
81,
21,
581,
806,
13067,
247,
3072,
10671,
50275,
2520,
6197,
310,
2834,
281,
2096,
984,
273,
253,
604,
285,
760,
604,
275,
253,
6197,
50276,
81,
22,
50276,
41528,
337,
752,
1057,
3588,
1599,
50276,
81,
22,
50276,
40907,
1037,
6474,
50276,
22309,
310,
436,
390,
247,
3806,
310,
5816,
50275,
7152,
339,
431,
248,
2934,
273,
970,
247,
29201,
32474,
9990,
347,
247,
260,
553,
5849,
26766,
253,
32624,
5933,
310,
4722,
253,
5661,
1543,
273,
253,
32624,
5933,
342,
253,
4081,
16399,
2557,
403,
21414,
891,
2985,
14023,
342,
3082,
326,
12215,
8654,
5482,
285,
326,
403,
1754,
327,
643,
9241,
685,
38754,
19945,
3186,
347,
310,
323,
1650,
253,
564,
15453,
300,
81,
1332,
337,
891,
671,
2985,
625,
7000,
14023,
273,
253,
15180,
673,
15180,
673,
310,
5393,
2067,
2069,
275,
253,
2929,
275,
247,
14122,
2468,
1420,
1039,
891,
4282,
849,
43245,
17905,
310,
253,
4588,
4715,
273,
253,
11454,
2990,
1754,
29107,
285,
849,
352,
11852,
253,
4583,
15180,
10454,
273,
253,
32624,
5933,
50276,
250,
3065,
337,
5987,
2700,
6113,
90,
1064,
317,
25501,
304,
2140,
72,
706,
18789,
81,
3239,
337,
2139,
253,
1750,
326,
19349,
2605,
4715,
556,
7826,
4618,
4893,
310,
417,
4516,
407,
10414,
281,
4893,
533,
407,
10414,
281,
5098,
327,
27629,
273,
19349,
3210,
285,
37851,
29886,
3210,
275,
10199,
2067,
1027,
3082,
323,
8350,
4715,
403,
7117,
2299,
271,
1774,
3332,
1332,
374,
432,
253,
2021,
273,
4868,
3169,
3082,
310,
417,
5393,
50276,
28692,
337,
9484,
295,
310,
417,
2931,
3239,
608,
2593,
577,
14217,
18747,
17713,
285,
33425,
403,
417,
2931,
275,
253,
5426,
273,
17697,
14275,
273,
1269,
285,
340,
1677,
1182,
253,
1307,
268,
91,
327,
38309,
390,
21839,
327,
1182,
275,
253,
1307,
327,
298,
11285,
310,
5816,
50276,
32897,
440,
1419,
10414,
24088,
34843,
301,
2781,
4714,
8734,
2158,
285,
2781,
8734,
2158,
310,
253,
1072,
1436,
285,
943,
320,
6289,
275,
253,
1072,
1039,
2057,
3302,
84,
390,
2120,
806,
4454,
943,
320,
908,
12724,
436,
25577,
310,
3768,
2920,
13477,
50276,
75,
2496,
66,
27887,
77,
1162,
355,
3210,
14720,
285,
17032,
4049,
8298,
42487,
4049,
8298,
9835,
2315,
655,
5307,
50276,
250,
3065,
374,
278,
44693,
17655,
285,
480,
260,
1316,
561,
7007,
4872,
10717,
323,
253,
17699,
16561,
2990,
2605,
4715,
1895,
13345,
9260,
27146,
22029,
50276,
28209,
4240,
1521,
79,
209,
15017,
1787,
2640,
5987,
3088,
1528,
72,
6903,
11718,
75,
435,
565,
1252,
26906,
4838,
50276,
7152,
339,
431,
248,
2256,
2647,
2668,
598,
275,
436,
2929,
310,
1077,
9371,
347,
3082,
323,
18918,
253,
1511,
273,
941,
908,
275,
253,
5661,
2593,
403,
1335,
275,
878,
253,
2173,
3210,
908,
323,
5175,
403,
973,
16191,
285,
253,
6260,
1677,
323,
731,
310,
18511,
253,
4327,
273,
11333,
323,
5301,
2789,
3282,
671,
253,
1332,
1677,
1060,
2722,
9023,
323,
18918,
690,
2834,
941,
581,
2181,
1677,
598,
407,
1469,
281,
271,
14275,
3169,
32624,
352,
3133,
310,
253,
3745,
247,
4868,
3169,
32624,
556,
281,
3686,
949,
512,
7363,
387,
1016,
3213,
281,
1089,
253,
5024,
1635,
615,
37713,
326,
651,
4917,
253,
8654,
824,
4868,
594,
327,
253,
958,
273,
352,
271,
14275,
3169,
32624,
310,
8489,
18134,
281,
247,
4868,
3169,
32624,
4931,
436,
310,
417,
2032,
323,
253,
7092,
275,
436,
2929,
533,
253,
2523,
943,
3164,
320,
5469,
9366,
604,
352,
369,
891,
9829,
352,
275,
1635,
11795,
1543,
651,
452,
644,
1077,
1199,
14109,
671,
352,
651,
320,
9371,
281,
19148,
2139,
247,
4868,
3169,
2746,
369,
2649,
23321,
275,
253,
806,
1659,
390,
275,
1635,
253,
269,
2510,
5933,
11106,
310,
417,
253,
3236,
269,
2510,
5933,
4931,
247,
3806,
281,
253,
3236,
651,
320,
9371,
275,
1635,
50276,
85,
25004,
253,
2990,
281,
5115,
247,
37139,
414,
1677,
407,
6485,
3640,
310,
1679,
9371,
685,
25184,
1333,
281,
15338,
7661,
932,
273,
9297,
3692,
38730,
403,
1677,
407,
6485,
3640,
50276,
783,
4028,
310,
247,
2372,
1892,
281,
956,
275,
5053,
4931,
1529,
14835,
1509,
651,
1361,
2490,
187,
4118,
18435,
27,
13518,
2278,
253,
2929,
29328,
247,
32624,
5933,
342,
247,
1980,
2557,
273,
17697,
10096,
3185,
273,
247,
4156,
4868,
7744,
908,
3786,
50275,
856,
84,
50276,
6790,
3185,
273,
4156,
310,
7826,
3486,
1020,
50276,
783,
2201,
5649,
310,
326,
597,
476,
6016,
2570,
14561,
7688,
50275,
783,
1332,
310,
17285,
407,
10527,
1543,
50276,
5040,
50276,
8826,
12744,
7234,
285,
10414,
5816,
534,
253,
4477,
18607,
281,
3451,
50276,
783,
17401,
310,
271,
2997,
50275
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors introduce caesar an embodied simulator for generating multimodal referring expression datasets the generated referring expressions can include both verbal utterances and nonverbal gestures pointing gaze and require resolving spatial relations between objects in the scene the simulator can also be used to generate contrastive examples where the referred object from the verbal utterance does not match the object referenced by the nonverbal gesture the authors use their simulator to generate two datasets and benchmark various existing methods on these generated datasets i am mostly happy with this work but i would like weakness 1 to be addressed since this seems like a glaring blindspot 1 table 1 in the paper is extremely well designed and the information is succinctly conveyed everything i need to know about this work in comparison to existing referring expression datasets is neatly summarized in this one table the related work is also very well covered 2 the methodology for designing the simulator is described in great detail and very easy to follow all the design decisions are motivated well it is clear the authors put a great deal of thought into possible issues with the dataset 3 the dataset analysis is very well done and gives useful insights 4 the authors release not only their generated datasets but also their simulator 5 the paper is wellwritten and very easy to follow 1 this seems to be a glaring omission for the two generated datasets caesarl and caesarxl no trainvaltest splits are reported does that mean the models were trained and evaluated on the full data if so i cant take any of the results in section 6 at face value 2 while it is great that the authors are releasing not only their generated datasets but also the simulator its not clear how the authors think this simulator can be useful to other researchers what can the simulators be used to design that is not already covered by the generated caesar datasets this is not a hard requirement or even a real weakness but would be useful to motivate releasing the simulator or as future work discussion 3 the paper could use more visionlanguage pretrained model baselines such as uniter or vilt or any other vl transformer model that jointly takes vision and language inputs and processes them in the transformer together docsepthis work introduces a new embodied multimodel referring simulator caesar for understanding human verbal utterance pointing and gaze two datasets of varying sizes have been collected from caesar containing resourceful information for referential tasks besides several baselines about configuring the grounding tasks are performed to illustrate the different impacts of different modalities i do appreciate the dataset synthesizing method caesar bridges the simulation in a virtual engine with motion captures in reality making it possible to run experiments on testing the embodied ais performance in real indoor settings the data collecting process is clear but nontrivial compared with previous related work on collectingsynthesizing pointing and gazes eg referit3d clevrref the caesarlxl dataset contains thorough descriptive information for the agent as well as for the sceneobject understanding as a dataset plus simulator the dataset is wellorganized the website for holding the benchmark is welldesigned the thing i worry about most about the paper is the generalizability of the dataset a even though the dataset is of huge size it is based on few scene types i am wondering if is easy for the dataset to be 
synthesized in photorealistic indoor scenes b for the 80 objects in the dataset in my opinion they mostly are small items in the kitchen or living room how can we believe they are representative of embodied ai tasks c the dataset brings human gestures and gazes but i dont see this work describing the variability when humans are pointing and looking at an object for the experiments this work runs several endtoend experiments on grounding the referential task given inputs of multimodalities however i want to know a what is the insight when we bring gaze pointing language and views together bhow do pointing and gaze cooperate to bring or remove ambiguity from language descriptions docsep1 the caesar developed by the authors break the limitations of the previous dataset for capturing embodied multimodal referring expressions by providing the awareness of multiple perspectives 2 the data generation system is easy to use and can provide large convenience for other researchers 3 the use of this data generation system for multiperspective awareness is an enlightening breakthrough for current research in the field 4 benchmarks with sota methods are done with the provided datasets for grounding embodied spatial relations 1 the data generation system developed in this work can help increase the models robustness with the awareness of multiple perspectives 2 the proposed data generation system is easy to use and userfriendly and open to researchers 3 a brief summary of the limitations and future works are given which provides new inspiration for subsequent work in this area in the experiment result shown in table 3 adding the result of combining two different perspectives and analyzing the results will be more convincing for this part docsepthe paper presents caesar a simulator capable of automatically generating multimodal referring expressions with verbal utterances and nonverbal cue pointing gesture and gaze to refer to an object the major contributions of the paper includes the caesar simulator as well as two largescale datasets caesarl and caesarxl with over a million samples in total the most important advantage of these datasets compared to existing datasets is that these datasets provide ego firstperson exo third person and top views of the scene where all previous datasets only provide exo views the authors benchmarked various models on the datasets 1 the paper presents two large scale datasets that have unique novelties compared to all existing datasets such as these datasets provide ego firstperson exo third person and top views of the scene where all previous datasets only provide exo views and these two new datasets provide contrastive samples and ambiguous samples these novel features can benefit research on interactive ai 2 the authors not only provide the datasets but also release a configurable simulator for generating the datasets so if researchers are unsatisfied by the configurations used to generate the two provided datasets they can use the simulator to generate their own data with their desired configurations 3 the authors benchmarked their datasets on a few sota multimodal models such as clip vitbert etc and studied the impacts of modalities as well as the multiple perspectives thus pointing out advantages of the datasets as well as providing potential future research directions for users of the datasets 4 the authors carefully documented details of their dataset collection process and how to use their simulator tools in the supplemental materials na docsepthe authors of the 
paper propose a new embodied simulator developed in the unity engine the task of the simulator is to generate training data for multimodal referring expression task ie containing verbal language and nonverbal gesture gaze cues with respect to the target object the authors point out the ambiguity of referring to objects in the real world perspective and mistakes in gestures and address those while creating two proposed datasets with the use of their simulator the authors provide various models and ablations explaining the shortcomings of current approaches to multimodal referring expressions i believe that the authors accurately identified the state of current datasets and benchmarks for referring expressions specifically i think it is a good assessment to include a perspective as an important factor in the description of the given scene which is the main differentiating factor with respect to yourefit 1 i find that introducing a customisable tool that can generate any arbitrary number of samples for the task is valuable for the community i also appreciate the inclusion of contrastive and ambiguous samples 1 yixin chen qing li deqian kong yik lun kei songchun zhu tao gao yixin zhu and siyuan huang yourefit embodied reference understanding with language and gesture in proceedings of the ieeecvf international conference on computer vision 2021 i find several aspects of the work that could be polished further or provided with some more insight 1 from my experience in unity i would recommend releasing the simulator as a standalone application with new versions of unity packages sometimes become outdated wrt the version of unity and compatibility may be lost i would suggest to the authors to consider that as a possibility to increase the sustainability of the simulator even though unity offers to update packages when opened in a new version it does not always resolve all updates correctly 2 i would expect a bit more thorough description of the dataset samples what is the number of various scenes how many verbal cues are generated for each scene what is the number of images generated for each sceneexpression what is the range on the number of generated frames when including video 3 further the creation of contrastive samples could be slightly adjusted specifically with respect to situations when the verbal cue is pointing to a nonexisting object line 227 i am not certain if that is very helpful in training the model my doubt is what would be the expected output of the model if making an inaccurate verbal description i would consider miscueing only one property of the object eg given only one mug yellow on the table referring to it as the red mug such that there is still information to be learned in the expression 4 for the benchmark if i understand correctly a single verbal expression from exo or ego perspective is paired with all the views without providing any cue to which perspective verbal utterance corresponds i think there may exist a lot of ambiguity in the samples affecting the results 5 on a smaller note some comparison of using contrastive samples with the previous works like 2 could be useful also some insights behind the choice of a specific engine from the visual quality perspective would be nice finally i find some things worth to be clarified not necessarily weaknesses yet see questions 2 ankit goyal kaiyu yang dawei yang and jia deng rel3d a minimally contrastive benchmark for grounding spatial relations in 3d advances in neural information processing systems 2020 docsepthis work 
proposes 1 a new task for embodied referring expression 2 a tool for easyprogramming with a simulator for generating referring expression data as well as 3 two generated datasets 1 a new task for embodied referring expression two referring expressions are given in a verbal and a nonverbal gazingpointinggazingpointing manner the task is to decide whether or not the two referring expressions are indicating the same object the nonverbal cues are provided to a model via rendered images of the scene from ego exo or top viewpoints please correct me if im wrong with regard to this point in order to perform this task a model has to reason about spatial relationships described in the verbal cue learn the links behind ego vs exo vs top viewpoints and practice perspectivetaking 2 a tool for easyprogramming with a simulator for data generation caesar caesar includes 80 objects and supports variations in color size position and rotationangle caesar is able to generate verbal referring expressions from 5 templates which supports easytodifficult levels of compositional reasoning over object attributes and spatial relationships caesar is able to synthesize the gesture and gaze of a humanoid avatar based on real human motions captured by optitrack 3 two generated datasets a large l and an extralarge xl version the datasets can be used to test for understanding of nonverbal referring expressions as well as the perspectivetaking ability of an embodied agent this paper correctly recognizes that the perspectivetaking ability and the ability to infer nonverbal cues from multimodal input are the crucial missing components for adapting vision models which were trained on nonembodied static datasets to an embodied environment this paper incorporates the above intuitions into their task design nonverbal cues from multimodal input a model is required to compare the referred objects by nonverbal cues vs the objects referred by verbal expressions perspectivetaking a model is required to perform the referring expression task while being viewpointaware egoexotop because spatial cues eg to the leftright of are inherently viewpointdependent this paper develops a simulator for data generation as well as two generated datasets to facilitate research for more capable embodied agents 1 caesarl vs caesarxl more explanation behind keeping two separate datasets would be great what are the differences between l and xl except for the sizerelated factors ie samples image resolution and imagetosample ratio do you expect these two datasets to support different evaluation scenarios or target different skillsets do you expect these two datasets to exhibit separate difficulty levels do you provide trainvaltest splits if yes how do you make the split if there is no significant difference why not merging them into a single dataset 2 line272 force models to utilize nonverbal cues by recognizing which perspective did the given utterances come from i didnt get how the model is supposed to recognize that when there are misalignments one might consider two possibilities 1 the current example is a contrastive example where misalignment is anticipated 2 its necessary to fit oneself into another viewpoint how do you expect a model to distinguish between these two possibilities please correct me if im misunderstanding id appreciate the clarification 3 related to q2 line406 generates verbal expressions from multiple perspectives actor and observer how could the model decide whether an input verbal expression comes from the actor or the observer do 
you expect the model to infer such information from the input images and how 4 line302307 i didnt get the essential difference between the dualencoder and the latefusion model besides using a different visual encoder vit vs resnet50 for example does the dualencoder model perform any fusion earlier than the latefusion model 5 another clarification question from where do you expect a model to extract gazingpointing cues based on my current understanding a model has to attend to the avatars eyes and hands from the image input correct however how could a gaze cue be extracted from a top or ego view more explanation is greatly appreciated 6 line384 participants correctly validated the relations in 8066 of the times it would be nice to provide some insight into when and where human participants failed at this task accounting for 20 of the examples could human failures be attributed to inherent ambiguity unclear annotation instruction the challenging nature of compositional reasoning or any other reasons 7 table3 the best model performance is exceeding 80 which is outperforming human participants where do you expect the future room for improvement to be
### Summary: | all the reviewers appreciated the effort and rigour that went into the design of the proposed idea the presentation of the paper clearly shows the embedding of the work within the state of the art it is particularly appreciated that this work is likely to lead to a break through in the analysis of nonverbal referring expressions given the possibility to generate multiple perspectives synthetically the experiments compared with multiple state of the art models and modalities was also appreciated the answers by the authors to reviewer comments were thorough and provided additional insights not all reviewers were in agreement on some aspects of the work and the more negative comments questioned whether releasing the simulator will really have the impact that it could have on first appearances xoln enpks final response seems more positive than the rating so i think that they may have forgotten to update their score the concerns of xoln regarding the practical details of releasing a simulator are to be considered however it is not clear to me from the paper whether releasing the simulator is really considered a major contribution of the work rather than a minor contribution | [
… ] | [
1, 1, 1, …, 1 ] | [
…
253,
6667,
812,
1966,
20101,
320,
12877,
281,
12794,
28931,
12744,
22581,
9775,
253,
11132,
3753,
273,
5889,
267,
14720,
390,
667,
643,
4606,
50276,
24,
2829,
20,
253,
1682,
1566,
3045,
310,
27433,
5096,
534,
310,
41731,
14692,
1966,
5014,
835,
513,
368,
1902,
253,
2852,
2316,
323,
7756,
281,
320,
2490,
187,
4118,
18435,
27,
455,
253,
30628,
14109,
253,
3434,
285,
8132,
454,
326,
2427,
715,
253,
2216,
273,
253,
4081,
2934,
253,
9759,
273,
253,
2929,
4518,
2722,
253,
21496,
273,
253,
789,
1561,
253,
1375,
273,
253,
1445,
352,
310,
3782,
14109,
326,
436,
789,
310,
2779,
281,
1421,
281,
247,
2740,
949,
275,
253,
1783,
273,
1327,
332,
7187,
14339,
12091,
1677,
253,
6387,
281,
6635,
2709,
24302,
5132,
85,
1037,
253,
4679,
2429,
342,
2709,
1375,
273,
253,
1445,
3210,
285,
33433,
369,
671,
14109,
50275,
783,
9172,
407,
253,
4477,
281,
37317,
5701,
497,
11080,
285,
2530,
3081,
16039,
50276,
1439,
512,
30628,
497,
275,
4345,
327,
690,
7794,
273,
253,
789,
285,
253,
625,
4016,
5701,
17801,
1880,
20437,
253,
40022,
588,
1663,
452,
253,
3486,
326,
352,
812,
452,
327,
806,
18655,
1269,
13221,
50276,
257,
81,
661,
2457,
2380,
3133,
625,
2762,
685,
253,
13716,
594,
891,
1158,
326,
597,
778,
452,
14454,
281,
5731,
616,
4868,
50276,
783,
7350,
273,
1269,
13221,
5001,
253,
8542,
4278,
273,
20437,
247,
40022,
403,
281,
320,
2783,
2299,
352,
310,
417,
2590,
281,
479,
432,
253,
2929,
1880,
20437,
253,
40022,
310,
1663,
2783,
247,
2201,
7680,
273,
253,
789,
2581,
685,
247,
5884,
7680,
209
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper provides a framework for recourse ie counterfactual explanations that is robust to shifts in the model they formulate the robustified recourse setup as a minmax optimization problem where the max is over a neighborhood around the distribution over model parameters the model parameters are drawn from a mixture of k distributions so that the neighborhood is specified by gelbrich distance on each component they propose a finitedimensional version of the robustified optimization problem which can be optimized using projected gradient descent they evaluate their approach on the german credit dataset the small business administration dataset and the student performance dataset each of which demonstrates a different type of data distribution shift the setting of recourse in the presence of model shifts is wellmotivated especially since models are often updated over time the idea of formulating the problem as a distributionally robust optimization problem is compelling one weakness of this paper is that the technical solution provided is somewhat limited in particular the formulation in 4 relies heavily on the structural properties of the mixture distribution and gelbrich distance to reformulate the optimization problem and is not surprising given these assumptions since this is the main technical result in the paper it would have been interesting to see a more general setup that considered richer distance metrics for the empirical section a weakness is that dirrac the method proposed in this paper seems to optimize for ell2 cost whereas existing approaches seem to optimize for ell1 cost thus it is not clear how to compare the costs obtained by dirrac with the costs obtained in existing approaches moreover it could be interesting to further investigate the tradeoffs between validity and cost for dirrac to illustrate the cost of robustness incurred by the approach minor comment distributlly distributionally p 4 weak reject due to limited technical contribution and limited empirical comparison update after author response i appreciate the additional experiments for l1 cost included in the revision moreover although i appreciate the authors discussion of the motivation for the gelbrich distance i still think that the technical results are somewhat limited thus i keep my assessment the same docsepthis paper studies the problem of recourse actions aka counterfactual explanations while considering data distribution shifts or model shifts the proposed distributionally robust recourse action dirrac framework has the ability to generate valid recourse actions when model parameters shift over time dirrac adopts the distributionally robust optimization technique and the paper proposes a projected gradient descent method to solve the optimization problem experiments are conducted with both synthetic and real world data and the results have shown that dirrac methods can generate recourse actions with higher validity than two existing methods strengths 1 most existing work on recourse actions do not consider model change so the problem addressed by the paper is relatively new and it is an important problem since modeldata shifts are common in practice 2 the idea of consideringmodelling model shift as a mixture shift of model parameters and formalizing the problem as a minmax problem 3 the experiment results demonstrate the superiority of dirrac over the methods compared 4 the paper is well written weaknesses 1 it is not very clear how challenging it is to adopt the distributionally robust optimization 
technique for solving the recourse action problem it would be useful to let readers know clearly that the adoption is nontrivial which is particularly helpful for readers who are not familiar wi the distributionally robust optimization techinque 2 from the paper roar is a method for generating counterfactual explanations that are robust to model shifts but the experiments conducted do not consider roar as a baseline 3 it is not clear how efficient the proposed method is compared to existing methods 4 the performance of the proposed method under no model shifts should be evaluated as well other comments 1 on page 2 there is a typo distributlly distributionally 2 the paper does not have a conclusion section the paper tackles a very practical and relatively new problem regarding recourse actions the overall idea seems reasonable to me and the experiments have demonstrated the effectiveness of the proposed method the paper is also well written however the paper has a few weaknesses as described above the missing comparison with roar is a main concern the novelty of the proposed method regarding the adoption of distributionally robust optimization should be clarified too docsepthe paper considers the problem of generating recourse actions that are robust to shifts in the parameters of the classifier the authors present a distributionally robust optimization approach and experimentally show that for linear classifiers and under no actionability constraints the approach generates recourse actions that have high probability of being valid under shifts to the weights of the linear classifier strengths very clear and thorough exposition of the approach proposed the approach proposed is very sound technically weaknesses the experiments presented are rather limited and the value of the contributions is unclear in particular the problem of generating robust recourse was previously considered by upadhyay et al 2021 but the authors do not provide any evidence as to why their approach may be preferable which is particularly concerning given that the experiments considered by the authors are heavily inspired in those of upadhyay et al 2021 to strengthen the contributions of the paper the authors should focus on improving the experiments section the contribution of this paper would be much stronger if the authors compared the performance of their approach to that of upadhyay et al 2021 validated the claim that robust optimization solutions can be overly conservative because it may hedge against a pathological parameter in the uncertainty set in the context of algorithmic recourse and showed that their proposed approach overcomes this issue detailed comments on the experiments section as previously mentioned authors should compare their approach to upadhyay et al 2021 why was l1 not used as the cost function similarly to ar and mace is this an inherent limitation of dirrac if possible it would be best to use l1 as the cost function for all three approaches it would be very valuable to also include experiments for nonlinear classifiers eg mlps it would be very valuable to consider actionability constraints it seems like in practice many of the features of the realworld data sets would be immutable eg recession in the sba data set for the realworld data you use significantly fewer features than upadhyay et al 2021 why is this in my opinion none of the test data for which recourse is generated should be used for estimating theta and sigma my suggested approach split the data into train and test train 100 
classifiers only with the train data eg by subsampling 80 of the train data to estimate theta and sigma and then compute recourse on the test data in the recourse setting one typically does not assume access to the training data it would be valuable to discuss how could theta and sigma be estimated or rather guessed given no access to the data and what would be the implications in terms of recourse validity this could also be tested experimentally by setting theta to be the weights of the classifier for which recourse is generated and over and under estimating sigma to various degrees for the realworld data please include the accuracy of the classifier for which recourse is generated detailed comments on the introduction contrary to what is stated in section 1 paragraph 2 in my opinion counterfactual explanations and recourse actions should not be used interchangeably but rather authors should refer only to recourse actions since the problem of recourse validity under data shifts at least how it is presented in this work section 1 paragraph 5 is not applicable to counterfactual explanations section 1 paragraph 2 if a specific application can provide the negative outcomes with recourse actions it can improve the user engagement and boost the interpretability at the same time citations needed for these statements section 1 paragraph 3 we must consider age as an immutable feature it is not a must to consider age immutable several works consider it as a nondecreasing feature section 1 paragraph 4 various solutions has have been proposed section 1 paragraph 5 data shifts usually induce corresponding shifts in the machine learning models parameters organizations usually retrain models periodically in repose to data shifts the data shift itself does not induces changes to the parameters of the model section 1 paragraph 5 if a recourse action fails to generate a favorable outcome in the future then the recourse action becomes useless not necessarily see handling change over time via ex post facto in venkatasubramanian and alfano 2020 section 1 paragraph 5 and the trust on the machine learning system is lost citation needed comments regarding section 2 3 and 4 in my opinion algorithm 1 and theorem 34 should not be in the main text as these are just general wellknown concepts from convex optimization leaving more room for the experiments sections authors could just mention convergence rate of o1sqrtt similarly sections 41 and 42 could be moved to the appendix since they are not core to the main contribution of the paper and are not evaluated in the experiments section this would also give the authors more space for the experiments section and a conclusion equation 1 minx equation 2 7 8a 9a infminx in x or add x in x as an explicit constrain similarly to equation 1 equation 9a please explain why a margin of 05 rather than 1 is used while the authors present a very clear and thorough exposition of the approach proposed and the approach is technically sound the experiments presented are very limited and the overall value of the contribution is unclear in particular the authors do not compare their proposed method with previous approaches addressing the problem of generating robust recourse actions
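The reviews above describe the defended recourse method only at a high level: a min-max objective whose inner maximum ranges over an uncertainty set of classifier parameters, solved on the outside by projected gradient descent. A minimal sketch of such an outer loop is given below for illustration; the squared-L2 cost, the abstract worst-case term `worst_case_grad`, and the box used for the projection are assumptions made here and are not taken from the paper under review.

```python
import numpy as np

def robust_recourse(x0, worst_case_grad, lam=0.5, step=0.05, n_steps=200, box=(0.0, 1.0)):
    """Projected gradient descent on  ||x - x0||^2 + lam * (worst-case invalidity).

    `worst_case_grad(x)` is assumed to return the gradient (w.r.t. x) of the
    inner maximum over the uncertainty set of model parameters; how that inner
    problem is actually solved (e.g. via a finite-dimensional reformulation)
    is left abstract in this sketch.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        g = 2.0 * (x - x0) + lam * worst_case_grad(x)   # gradient of the outer objective
        x = x - step * g                                 # plain gradient step
        x = np.clip(x, box[0], box[1])                   # projection onto a feasible box
    return x
```

Swapping the squared-L2 term for an L1 cost would only change the first gradient term to `np.sign(x - x0)`, which is the comparison the reviewers ask for when they note that the baseline methods optimise an L1 cost.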
### Summary: | the paper provides a framework for recourse ie counterfactual explanations that is robust to model shifts the setup for the proposed method is a minmax optimization problem where the max is over a neighborhood around the distribution over model parameters the model parameters are drawn from a mixture of k distributions so that the neighborhood is specified by the gelbrich distance on each component the authors propose a finitedimensional version of the robustified optimization problem which can be optimized using projected gradient descent they evaluate their approach on the german credit dataset the small business administration dataset and the student performance dataset each of which demonstrates a different type of data distribution shift strengths most existing work on recourse actions do not consider model change so the problem addressed by the paper is relatively new the experiment results demonstrate the superiority of the proposed method over baselines weaknesses the solution provided is somewhat limited as it relies heavily on the structural properties of the mixture distribution and gelbrich distance to reformulate the optimization problem most of the reviewers voted initially for rejection the paper is borderline tending to rejection after the rebuttal the authors have also considerably updated the paper with new results after the initial reviews it seems therefore that the paper may benefit from another round of reviewing and because of this i recommend rejection and the authors to use the reviewers comments to improve the paper before resubmitting to another venue for another round of reviewing | [
1895, 534, 476, 320, 18325, 970, 16589, 11786, 18499, 597, ... 281, 1529, 18767, 323, 1529, 3790, 273, 16725
] | [
1, 1, 1, ... 1, 1, 1 (all entries are 1)
] | [
1895, 534, 476, 320, 18325, 970, 16589, 11786, 18499, 597, ... 281, 1529, 18767, 323, 1529, 3790, 273, 16725
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the goal of this paper is to train deep rl agents that perform well both in the presence and absence of adversarial attacks at training and test time to achieve this this paper proposes using policy distillation the approach distilled agent dqn dadqn consists of 1 a teacher neural network trained in the same way as dqn and 2 a student network trained with supervised learning to match the teachers outputs adversarial defenses are only applied to the student network so as to not impact the learning of qvalues by the teacher network at test time the student network is deployed this idea of separating the learning of qvalues from the incorporation of adversarial defenses is promising one adversarial defense considered in the paper is adversarial training applying small fgsm perturbations to inputs before they are given to the network in a sense the proposed approach is the correct way of doing adversarial training in deep rl unlike in supervised learning there is no ground truth for the correct action to take but by treating the teachers output for an unperturbed input as ground truth the student network can more easily learn the correct qvalues for the corresponding perturbed input the experimental results support the claim that applying adversarial training to dadqn leads to agents that perform well at test time both in the presence and absence of adversarial attacks without this teacherstudent separation incorporating adversarial training severely impairs learning table 2 dqn def column this separation also enables training the student network with provably robust training however i have a few significant concerns regarding this paper the first is regarding the whitebox poisoning attack that this paper proposes called untargeted qpoisoning uqp this is not a true poisoning attack since it attacks not just at training time but also at test time also the choice of adding the negative of the fgsm perturbation during training time is not clearly justified why not just use fgsm perturbations the reason given in the paper is that this reinforces the choice of the best action wrt the learned qvalues to give the illusion of successful training but why is this illusion important and is this illusion actually observed during training time what are the scores obtained at the end of training table 1 only reports testtime scores in addition although most of the paper is written clearly the experiment section is confusing i have the following major questions what is the attack atk section 43 is it exactly the same as the defense def except the perturbations are now stored in the replay buffer are attack and defense perturbations applied at every timestep in section 42 when uqp is applied is it attacking both at training and at test time given the definition of uqp section 24 the answer would be yes if thats the case then the none row in table 1 is misleading since there actually is a test time attack the experiments could also be more thorough for instance is the adversarial training defense still effective when the fgsm epsilon used in test time attacks is smaller or larger also how important is it that the student network chooses actions during training time rather than the teacher network an ablation study would be helpful here overall although the algorithmic novelty is promising it is relatively minor due to this and the weaknesses mentioned above i dont think this paper is ready for publication minor comments questions tables 1 and 2 should report 95 confidence intervals or the standard error its 
strange to apply the attack to the entire 4stack of consecutive frames used ie the observations from the last four timesteps it would make more sense if the attack only affected the current frame for adversarial training what probability p section 32 is used in the experiments in section 42 what does weighted by number of frames mean in which experiments if any is noisynet used section 41 mentions it is disabled and epsilongreedy exploration is used instead but i assume its used somewhere because its described when explaining the dadqn approach section 31docsepstating the observation that the rl agents with neural network policies are likely to be fooled by adversarial attacks the paper investigates a way to decrease this susceptibility main assumption is that the environment is aware of the fact that the agent is using neural network policies and also has an access to those weights the paper introduces a poisoning attack and a method to incorporate defense into an agent trained by dqn main idea is to decouple the dqn network into what they call a student policy network and a q network and use the policy network for exploration this is the only novelty in the paper the rest of the paper builds upon earlier ideas and incorporates different training techniques in order to include defense strategies to the dqn algorithm this is summarized in algorithm 1 called dadqn both proposed training methods adversarial training and provable robust training are well known techniques the benefits of the proposed decoupling is evidenced by the experimental results however only three games from the atari benchmark set is chosen which impairs the quality of the evidence in my opinion the work is very limited in originality with limited scope that it only applies to one type of rl algorithm combined with the very few set of experiments for supporting the claim fails to make the cut for publication below are my suggestions for improving the paper 1 major improvement of the exposition a section 22 agent aware game notation is very cumbersome please clean up and give an intuitive example to demonstrate b section 3 title is our approach however mostly talks about the prior work either do a better compare contrast of the underlying method against the previous work with clear distinction or move this entire discussion to related work section 2 needs more explanation how training with a defending strategy can achieve better training rewards as opposed to epsilon greedy 3 improve the exposition in tables 1 and 2 it is hard to follow the explanations with the results in the table user better titles and highlight the major results 4 discuss the relationship of adversarial training vs the safe rl literature 5 provide discussions about how the technique can be extended into trpo and a3cdocsepthis paper considers adversarial attack and its defense to dqn specifically the authors propose a poisoning attack that is able to fool dqn and also propose a modification of dqn that enables the use of strong defense experimental results are provided to justify the proposed approach detailed comments 1 although the attack approach seems easy to implement it would be interesting to see why it works it might make this paper better if the intuition of the uqp is provided fgsm is a wellknown attack for deep learning models what is the intuition of using the sign of the gradient of the crossentropy since the argmax is a onehot vector this crossentropy seems illdefined how to compute the gradient 2 it would also be interesting to see why 
taking actions based on the student network enables better defense in dadqn the authors seem to combine a few tricks proposed by existing works together it might be better to highlight the contribution and novelty of this approach
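The reviews describe the distillation defence only in prose: a teacher network is trained as an ordinary DQN, a student network is trained with a supervised loss to match the teacher's action choices, and adversarial perturbations are applied only on the student side. One possible shape of such a defended training step is sketched below; the cross-entropy distillation loss, the single-step FGSM perturbation, and the network interfaces are illustrative assumptions, not the paper's reference implementation.

```python
import torch
import torch.nn.functional as F

def student_defended_step(teacher, student, optimizer, obs, eps=1.0 / 255):
    """One defended distillation step (illustrative sketch, not the paper's code).

    `teacher(obs)` and `student(obs)` are assumed to return per-action scores
    for a batch of observations with values in [0, 1].
    """
    with torch.no_grad():
        target_actions = teacher(obs).argmax(dim=1)        # teacher acts on the clean input

    # FGSM perturbation of the observation w.r.t. the student's distillation loss
    obs_adv = obs.clone().detach().requires_grad_(True)
    attack_loss = F.cross_entropy(student(obs_adv), target_actions)
    grad = torch.autograd.grad(attack_loss, obs_adv)[0]
    obs_adv = (obs + eps * grad.sign()).clamp(0.0, 1.0).detach()

    # Supervised loss: the student must still pick the teacher's action on the perturbed input
    loss = F.cross_entropy(student(obs_adv), target_actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because only the student ever sees `obs_adv`, the teacher's Q-learning targets are computed on unperturbed transitions, which is the separation the reviews credit with keeping training stable when defences such as adversarial or provably robust training are added.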
### Summary: | reviewers had several concerns about the paper primary among them being limited novelty of the approach the reviewers have offered suggestions for improving the work which we encourage the authors to read and consider | [
[input_ids, attention_mask, and labels token arrays omitted]
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors propose a debiasing method in the learning to rank l2r regime they advance the hypothesis that contemporary debiasing methods are lacking since they assume a click model where the effect of the position of an item in a ranked list on the probability of observation by the user is constant across different contexts the authors instead propose learning parametric models for observation probability and relevance probability since only clicks are observed in practice and they assume pclickx pobservationx position prelevancex the two quantities of observationrelevance are not identifiable as a result the authors instead propose two conditions to fulfill a soft decoupling condition that maintains ranks in their observational model when compared to the true probabilities of observation 1 lipschitz decoupling where pobservationx position is alphalipschitz in x and its corresponding estimation model is also betalipschitz in x for a value of beta that falls within a bound dependent on alpha 2 bernoulli decoupling the observation probability is once again alphalipschitz but here the corresponding estimation model is instead forced to output 1 with probability t forcing the ranker to learn directly from click data if t falls within a bound dependent on alpha then soft decoupling also applies here the authors then validate their model across a linear and neural network ranker and compare to a number of unbiased baselines in semisynthetic l2r settings this is a good contribution to the area of l2r the model they study is elegant and natural and i was surprised there has been no prior work making similar assumptions overall the clarity of exposition was very good and the paper was easy to follow however i felt that the authors could have provided more intuition in the bernoulli decoupling section see questions while the ideas presented are quite valuable i felt the results in the experiments were a bit underwhelming the authors assume that their hypothesis about disparate effects of rankings is correct in their simulations and only show marginal improvements in ndcg when compared to dla unless their coupling parameter is significantly increased the ndcg difference between dla and lbd are smaller than the differences between lbd and itself for two different values of the hyperparameter t that are apart by 01 however i did appreciate the completeness of their experimental results the authors included results across a wide range of simulation parameters and compared their method with a broad range of baselines overall im leaning towards an accept although i would be willing to change my score up from a weak accept to an accept if the authors are able to make a compelling case for the strength of their experimental results and address my questions below given that the intent of l2r models is for them to be deployed on large internet platforms i would have liked to have seen some discussion about the societal impact of the work docsepthe authors propose an unbiased learningtorank ultr method which relies on only clicks and performs decoupling to separate relevance and observation parts from the click model given that we observe only clicks it is very difficult to retrieve the true relevance and observation parameters because multiple combinations of relevance and observation can result in the same click the authors propose a disentangling method to decouple the two parameters with theoretical guarantees empirical results on the standard semisynthetic benchmark demonstrate the efficacy of the method 
over established baselines i have read and considered the authors response some strong points of the paper the paper is wellwritten with a proper introduction to the area of ultr and biases in click data given the limited space in the neurips format the authors have done a commendable job of introducing the related works and the problem itself the assumptions are laid out in an intuitive manner and the method is supported by theory empirical results indicate strong performance as compared to baselines some weak pointscomments on the draft during training when clicks are estimated via a product of predicted relevance and predicted observations even with the regularization term on the norm of gradient a trivial solution would be to make the observation models prediction constant and relevance equal to clicks how will the model recover the true relevance in such a case even with soft decoupling it needs to estimate relevance up to a constant overall i think the paper presents an interesting theoretically motivated algorithm for ultr na docsepin this work the authors studied and proposed a lipschitz and bernoulli decoupling lbd model to decouple relevance and observation at individual document level in unbiased learningtorank the authors first explained about the coupling of relevance and observation as there are common factors affecting both they then translated the coupling effect into the main assumption of unbiased click prediction and defined both hard decoupling model decomposes as the ground truth and soft decoupling model decomposition preserves the order of relevance function based on that assuming the lipschitz continuity of the observation function the authors proved that the soft decoupling could be achieved by enforcing a lipschitz continuity on the observation model with a finite lipschitz constant and gave the upper and lower bound based on the same continuity assumption the authors also showed bernoulli observation model bom can achieve the soft decoupling within the bounds of the observation dropout rate finally the authors described the objective that combines both lipschitz and bernoulli decoupling methods in the experiment the authors showed on the synthetic dataset lbd could achieve the best unbiased results and also did the ablation studies on lipschitz and bernoulli alone with a brief discussion of limitations the authors concluded the paper in the appendix the authors extended related works proved the main theorems and included more experiment details strengths 1 the work studies a general and important but previously ignored issue in unbiased learning to rank which has been the core of industrial application in recent years 2 well written and solid theory 3 complete study with sufficient ablation study and discussion on limitations weaknesses 1 the authors only demonstrated their method on the synthetic dataset despite the method is claimed to be applicable to more nontrivial bias it could be more convincing if authors could test the method on realworld click data 2 the authors should better doublecheck their significance analysis there are some results appearing absurd eg in table 1 0659 of lbdlips linear model on istellas is significant worse while 0592 of unlimited linear model is not significant also not sure which exactly the baseline was when applied the significance analysis why not 0686 and 0687 are not significant if 0707 is significant assuming the baseline is the best one with 0701 3 in figure 2c when authors claimed about better performance of lbd at low 
eta in fact they only have a single point lbd performs better which is eta0 exactly no coupling isnt that somewhat deviated from the main claim of the paper can authors add some points for example 0eta01 to show lbd does work better at low eta end 4 minor points on notations a variable t appears to be doubly used in theorem 2 and in bernoulli model parameter b appendix chapters should better always be clarified somewhat confused when not knowing there is an appendix please see my comments in questions and weakness above docsepthis paper studies the unbiased learning to rank problem different from existing work the paper first argues that the bias factor should use any existing features thus can go to individuallevel then the authors argue that there will be a coupling effect between the relevance tower and bias tower as they share a lot of input signals then two simple methods are proposed the first one enforces a smoothness constraint on the bias model the socalled lipschitz observation model the second one uses dropout on the the bias model output the socalled bernoulli decoupling model the paper provides some theoretical analysis on when decoupling is achievable experiments are done on semisynthetic datasets with synthetic generated clicks on public datasets overall the paper can be strengthened with more compelling theoretical analysis and experimental evaluation the reviewer acknowledges the authors reply the dropout argument is decent as some of the major concerns are still there strong assumptions only on semisynthetic dataset smoothness method only works on one dataset more concerning as the datasets are manually controlled will keep the score as the reviewer was initially between 3 and 4 strength the problem studied is important and the motivation is solid the theoretical analysis is interesting though the assumption is strong and the usefulness is unclear weakness the methods are simple which is good but not novel in fact the dropout method was proposed 3 years ago recommending what video to watch next a multitask ranking system but it was not cited the smoothness method has several concerns see below and do not seem to perform well in experiments the theoretical analysis is not very compelling to the reviewer the assumption is strong the authors acknowledged it which is good for example the smoothness constraint is not very convincing the authors already mentioned the positiononly case also it is intuitive that different query types can have quite different browsing behaviors easily violating the smoothness constraint it only shows feasibility but no guarantees in practice its about hyperparameter tuning the experiments are not satisfactory first using semisynthetic dataset is common in the literature so it does not add extra points while many people use realworld datasets also the reviewer feels it is more important to test this paper on more realistic datasets indeed as the authors motivation states the bias can have complex correlation with input features existing work mainly generate bias only depends on position so it could be quite simple but this work has a different motivation and the way it generates bias can be too simplified third the authors are suggested to compare with some more recent methods the most recent baseline was in 2019 na
### Summary:
the paper studies the unbiased learning to rank problem and introduces new assumptions and techniques to learn good ranking policies from biased data the key insight is to use smoothness assumptions to decouple the effect of observation from relevance the reviewers appreciated this novel attack on a significant problem and the authors clarified how existing positionbased models and other debiasing strategies emerge as special cases of their smoothness assumptions some reviewers pointed out deficiencies in the empirical study lack of results on realworld click data questions around statistical significance and finergrained sweeps of hyperparameter ranges which the authors subsequently clarified during the feedback phase adding a realworld experiment even on the limited tiangongst dataset that the authors identified but rejected would substantially strengthen the claims in the paper that said the lipschitz and bernoulli decoupling are likely to be of interest to the learning to rank community and spur followup work on better user modeling
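As a rough illustration of the click-model decoupling the reviews and meta-review refer to, the sketch below factors the click probability into an observation tower and a relevance tower and applies the "force the observation factor to 1 with probability t" trick during training. The architecture, feature shapes, and hyperparameter names are assumptions for illustration, not the authors' released code.

```python
# Minimal sketch (assumed two-tower architecture, not the paper's code):
# p(click) = p(observe | x, position) * p(relevant | x), with "Bernoulli
# decoupling" approximated by randomly forcing the observation factor to 1
# so the relevance tower must explain part of the click signal directly.
import torch
import torch.nn as nn

class TwoTowerClickModel(nn.Module):
    def __init__(self, num_features, hidden=64, force_obs_one_prob=0.3):
        super().__init__()
        self.relevance = nn.Sequential(
            nn.Linear(num_features, hidden), nn.ReLU(), nn.Linear(hidden, 1))
        self.observation = nn.Sequential(
            nn.Linear(num_features + 1, hidden), nn.ReLU(), nn.Linear(hidden, 1))
        self.force_obs_one_prob = force_obs_one_prob  # the dropout rate "t"

    def forward(self, x, position):
        # x: (N, num_features) float tensor; position: (N, 1) float tensor
        p_rel = torch.sigmoid(self.relevance(x))
        p_obs = torch.sigmoid(self.observation(torch.cat([x, position], dim=1)))
        if self.training:
            # with probability t, pretend the document was certainly observed
            mask = (torch.rand_like(p_obs) < self.force_obs_one_prob).float()
            p_obs = mask + (1.0 - mask) * p_obs
        return p_obs * p_rel  # predicted click probability

# training would minimize e.g. binary cross-entropy between the returned
# click probability and the observed click labels
```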
[input_ids, attention_mask, and labels token arrays omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes to tune the number of models in a boosting ensemble in an instancewise fashion the idea is to first cluster the samples using a decision tree and then to tune the size of the ensemble independently for each cluster instead of doing it globally for all instances an efficient twolevel crossvalidation procedure is designed to tune both the number of terms in each cluster and the number of clusters experiments are conducted on 6 largescale problems that show that local pruning brings some improvement with respect to the more standard global pruning technique adapting the ensemble size in an instancewise fashion is an interesting problem it is not new however unlike what is claimed in the paper this is addressed at least in these papers 1 hernndezlobato d martnezmuoz g surez a statistical instancebased pruning in ensembles of independent classifiers ieee trans pattern anal mach intell 2009 feb3123649 2 dayvid vr oliveira george dc cavalcanti robert sabourin online pruning of base classifiers for dynamic ensemble selection pattern recognition volume 72 2017 3 v soto s garcamoratilla g martnezmuoz d hernndezlobato and a surez a double pruning scheme for boosting ensembles in ieee transactions on cybernetics vol 44 no 12 pp 26822695 dec 2014 doi 101109tcyb20142313638 4 rafael mo cruz robert sabourin george dc cavalcanti dynamic classifier selection recent advances and perspectives information fusion volume 41 2018 pages 195216 the method proposed in the present paper based on clustering and crossvalidation seems however novel and significantly different from these works which perform pruning in an online way ie when a prediction needs to be done in addition the motivation of most of these works is also to reduce memory and computing times more than improving accuracy i think the related work discussion should nevertheless includes these approaches the proposed method is very straightforward it makes sense overall but i dont totally buy some of the arguments that are given in section 4 to motivate it first the discussion focuses only on the impact of pruning on predictive performance while pruning can be very useful to reduce storage requirement or computing times at inference and this is angle adapted in eg 3 above im not sure that it is that useful for improving predictive performance boosting has been shown to be quite robust with respect to the ensemble size if the learning rate is small enough and actually experiments in figure 1 and 2 confirm this fact since most error curves are monotonically decreasing i see only one curve that really significantly increases at some point so i was not expecting a priori a huge improvement in terms of predictive performance second some statements to motivate the idea of clustering are not really supported either by a theoretical or an empirical analysis although i agree that they make sense intuitively if you want to show that it is essential to preserve the initial geometry and to exploit the labels at the clustering stage then you should provide experiments to show that not doing so is indeed detrimental by using eg unsupervised clustering third the fact that the clustering is carried out without consideration of the boosting model makes the approach also suboptimal by design there is no guarantee that the clustering will be optimal when it comes to tune the ensemble size in each cluster the idea of the crossvalidation approach based on a single model per fold is sound and very relevant to reduce the computational cost of the approach 
note however that the cost is still important with respect to no pruning at all since it requires to grow k1 boosting ensembles instead of 1 but i agree that there is a negligible overhead with respect to tuning globally the ensemble size the empirical validation is carried out correctly from a statistical point of view i find however that the authors overemphasize the significance of the improvement they obtain with their approach looking at table 2 their local pruning technique brings an improvement of less than 1 in terms of 01 loss on four problems out of six and on the other two problems the difference remains very small im not sure that such level of improvement is actually worth the effort if one is only interested in predictive performance these results are also obtained with a single dataset split if i understand correctly given how small the difference is i think it would have been important to repeat the experiment several times with different splits to get standard deviations and maybe also to carry out a statistical test to check whether the improvements are statistically significant i dont think either that the experiments provide a satisfactory answer to the first two questions asked in section 5 as already discussed above to me figures 1 and 2 show that one can not expect strong benefit from the clustering since most error curves are monotonically decreasing part of the important diversity in the optimal size in the clusters seems to be due to the fact that long flat regions are observed which lead to an instable position of the optimum i would have like also a more systematic and quantitative experiment on all datasets to answer the second question about the relevance of the validation protocol why not compare the optimum found by this protocol with the theoretical optimum found on the test set finally im surprised that the authors dont talk at all about the benefit of pruning on computing times at inference to me this is one of the main motivation for using pruning in the context of ensembles i would be interested to know how global and clusterbased pruning compare from this point of view minor comments in 31 the following sentence is unclear this method has nothing to do with the doubledescent problem what do you mean its not clear how you fix the number of leaves in a decision tree in the standard algorithm the order in which the nodes are expanded is arbitrary do you apply some bestfirst strategy which one if i understand correctly the baseline in table 2 is the standard globally pruned model if so i would like to see also the performance of a unpruned ensemble only one setting of the boosting algorithm is explored b5000 a learning rate of 002 and default parameters for the catboost model yet one expects that these parameters will have an impact on the performance its obvious for b at least i think that more combinations should be explored to make the conclusions more general looking at figure 2 there are more error curves above the average curve than below which suggest that clusters are potentially unbalanced it would be interesting to report the size of these clusters the paper is well written and the proposed method makes sense and is original but it lacks a bit of theoretical motivation and im not convinced of its practical relevance given the very marginal improvements observed in the experiments docsepan enhancement of the popular gradient boosting method is presented wherein the input space is subdivided into regions and the crossvalidated optimal number of 
trees to include in the ensemble is chosen on a perregion basis rather than using a single global number of trees for the entire input space consistently superior performance relative to standard gradient boosting across 6 benchmark datasets is reported the idea of partitioning the input space and using a varying number of models in the ensemble makes a lot of sense to me and is novel to the best of my knowledge gradient boosting and similar techniques do indeed remain widely used and very competitive for smalltomedium dimension tabular data so the practical significance of the method is clear although some minor edits are needed the presentation of the technique is clear for the most part my main concern is that the number of benchmark datasets is smaller than i would have liked given the option nowadays to rent computing power temporarily via the cloud its not unreasonable to expect much more than 6 benchmarks to be tested for a method like this for instance i once refereed a paper for another conference which tested their method on 40 benchmarks and although i liked the paper the paper got rejected i am not saying 40 is required but maybe 12 or 15 benchmarks would make me more comfortable with the results than just 6 with that said the method is elegant and intuitive and widely applicable so i dont mind at all if this paper is accepted the small number of benchmarks is the reason i can only give it a 6 one major suggested edit the author use the term object in a nonstandard way to refer to what are normally referred to as examples or data points instances observations etc ie the labelled xy pairs used to train a supervised learning model i suggest choosing a term other than objects one minor complaint on page 6 it is claimed that cluster surfaces are hard to validate in fact clustering can be evaluated out of sample much in the same way as supervised learning by looking eg at likelihood on a holdout sample so as to choose an optimal number of mixtures in a mixture of gaussians it is also possible to construct objective functions for something like kmeans which trade off k with distance of each data point to its cluster mean nonetheless it does make sense to me to partition with a decision tree so i am not bothered by this misstatement much a few other suggested edits page 3 the early stopping early stopping page 8 does the validation protocol proposed in section 44 has good generalization ability has have a method for partitioning the input space for a gradient boosting algorithm and choosing a different number of trees to be included in the ensemble in a partitiondependent way is presented improvements on 6 benchmark datasets relative to standard gradient boosting are reported the method is intuitive and elegant but the number of benchmark datasets is not as large as it could be and for that reason the paper gets a 6 marginal accept docsepthe paper proposes a novel method to set the optimal ensemble size in gradient boosting in particular the authors propose an adaptive strategy that sets distinct ensemble sizes for different regions of the input space for that they propose dividing the input space in coherent regions whose instances are similar both in terms of features and labels and estimating the optimal ensemble size within those regions for clustering data they propose using a decision tree induced from the entire training set and the leaves of the trees are the clusters that comprise the data partition they show the results of both a biased and a lessbiased estimator for finding 
out the optimal ensemble size per region and they compare their findings with the traditional strategy of simply pruning the ensembles based on a single optimal number of learners estimated from a crossvalidation procedure results in 6 public datasets show that in at least 4 of those datasets the method seems to provide better results strong points very simple method with easy implementation making its reproduction straightforward elegant idea for incorporating the approach within a crossvalidation procedure strong results albeit in a short amount of datasets weak points very limited experimental analysis for addressing that i would recommend you should try it on dozens of datasets given that the method is allegedly very fast you should compare your approach with several methods for hyperparameter optimization not only with the naive cv procedure i am curious to see whether a more sophisticated hyperparameter tuning approach can outperform you simple approach and if so with which efficiency it would do it you should employ statistical tests for better assessing the statistical significance of results in particular there are several nonparametric tests that can help you pointing out significant differences and also posthoc methods for pointing out pairwise differences for the case of multiple methods being executed over multiple datasets questions you say you define the number of clusters leaves in the decision tree according to the procedure described in section 44 it is not clear to me which procedure is that do you mean running alg 3 several times with distinct number of clusters and using the one with the best results how exactly did you do that since defining the number of leaves in decisiontree induction is not straightforward you need to try and control that via tree height or minimum number of instances per leaf i would guess that this step is a little bit more complicated than you make it appear and also the increased computational cost of running your entire procedure which has a nonnegligible cost itself multiple times i think that is a vital part of the method whose discussion is not done at all in the paper the shrink method in alg 3 is not detailed at all so i am assuming it means getting the predictions of the full model executed over fold sq and cutting off the results that use more than miq models would that be correct i am sorry for the confusion but that is not totally clear to me in inference test time i would assume you need to see in which cluster the new instance falls on and then using the number of models defined for that cluster is that also correct i ask because there is no mention on that at all in the paper paper with a nice elegant and very simple idea for adaptively generating different number of ensemble sizes throughout the input space in an attempt to generating better results the weak point is that the experimental analysis is not up to the standards that one would expect here as i would have expected the method to be tested over dozens of datasets with a proper analysis against other hyperparameter optimization approaches and statistical tests to validate the significance of the results
### Summary: | while the reviewers agree that the paper contains interesting ideas and the method is elegant it unfortunately does not meet the bar for acceptance i strongly encourage the authors to revise their paper in particular using the numerous comments made throughout the discussion phase for example it is important that the authors polish their work in particular for the updates provided eg figure 3 see efwa reviewers pointed the lack of updates on important claims by the authors in particular the claim regarding clustering vs decision trees see efwa the comments on the lack of diverse datasets see mexp some answers might have gained in clarity such as the reply to efwa on the application and conclusions following wilcoxon sign test | [
input_ids: token-ID encoding of the Input/Output text above (full sequence omitted) ] | [ attention_mask: all-ones sequence of the same length (omitted) ] | [ labels: token-ID sequence (omitted) ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies the problem of maintaining a good revenue in a shrinking market for combinatorial auctions of n bidders and m items the shrinking market is modeled as follows every bidder n participates in the market independently with probability p the valuations of the bidders participating in the market are hidden the modeling of the problem is the first contribution of the paper the second contribution is to show that in general settings not handling the shrinking market specifically may not just incur a revenue that is p times the optimal revenue but significantly less this result motivates even further the need to understand how much the revenue changes in a shrinking market the third contribution is a possibility result showing a lower bound on the revenue that can be extracted from a shrinking market the results proof is based on a new construction the winner diagram which may be of independent interest the last contribution is a learningfromsamplestype algorithm for nearly achieving this revenue strengths 1 the problem studied in the paper seems fundamental and to the best of my knowledge it has not been addressed in the literature i was also thinking that although the paper is written in a way to emphasize the inherent md component the problem looks like a core ml theoretical problem phrased as follows let s be a sample on which an algorithm is trained and obtains a model m with loss l if instead of s the same algorithm is used on a subset of s say s subseteq s then how far away are the two models m and m in terms of their losses this is not precisely the question addressed in this work but i was wondering whether there is a more mltheoretic language to phrase the problem 2 the paper is very well written i also liked the story of the paper because i found it rather complete for the problem at hand not only do the paper presents how much revenue can be preserved but also how to efficiently learn an auction achieving this revenue ie an auction robust to market shrinkage or general market uncertainty 3 the winner diagram technique which is at the center of the construction for the possibility result is quite neat to be honest i was wondering whether there are certain classes of auctions where one can build and traverse these graphs fast enough so that we dont have to learn from samples but this is certainly not required for me to think positively about this paper weaknesses knowing p or the distribution d is restrictive do the current results hold even if you know p approximately this would mean that there is a distribution d that is known with small distance from d such a result would strengthen the punchline of the paper even more evaluation in my opinion the weakness that i pointed out above was not as important as the strengths of the paper moreover they mostly constitute interesting directions for future work for that im leaning positive for the paper additional comments i understand that the assumption on the existence of independence in choosing the subset is done for technical reasons and i think it is okay given that this is one of the first studies of market shrinkage but in reality shouldnt we expect that there is some correlation between people who decide to leave the market that is at least the case in all the examples mentioned in the introduction where there was an outside force pushing people out of the market in lines 5661 i would suggest adding a short description of what delineability is in the first paragraph of section 3 when describing vjb would you mind 
explaining why this is correlated with i and what is i in this case fixed to after def 5 i would suggest adding a short paragraph about what d h and theta are for certain wellknown examples like the second price auction properly addressed by the authors docsepthis paper studies revenue guarantees for multiitem auctions under shrinking markets where the participation of bidders in auctions are independent and random the paper first presents a motivating example for vcg auctions that demonstrate only an exponentially small fraction of revenue can be guaranteed under shrinking markets when bidders values also depend on what others receive the paper then develops a probabilistic argument to show for a sufficiently large class of mechanisms there must exist a mechanism that is robust to market shrinkage and preserves a certain level of revenue finally the paper presents a sample based approach to find a mechanism that yields revenue at least the presented guarantee on revenue with high probability strengths the paper is wellwritten and to the best of my knowledge the probabilistic existence result for mechanisms that achieve certain revenue guarantees along with the construction and analyses of the winner diagram under shrinking markets is novel and provides quite substantial contribution to the understanding of auction revenue in this particular market scenario i find the motivating example in section 3 that states in certain scenarios vcg can only retain an exponentially small fraction of revenue quite enlightening and i appreciate the exemplifying numerics for revenue loss presented after theorem 31 despite being a theoretical paper i feel the paper presents nice insights into challenges for auction design and revenue maximization in practice weaknesses i do not see major weaknesses in the paper perhaps one questionconcern i have for the overall methodology of mechanism mathcala and the sample based approach to find a particular mechanism with good revenue guarantees is that both rely on the knowledge of p which is the probability that a bidder participates in an auction if there is misspecification in this parameter especially over estimations it seems that certain mechanism equivalence classes may be ruled out which in the worst case may be those who correspond to the highest revenues if this is true is there any intuition regarding how sensitive are the revenue guarantees to misspecification of p i may be missing something here and would be great if the authors can clarify na docsepthis paper models shrinking markets with uncertain size in multiitem setting and proposes a samplebased learning algorithm with provable guarantees on how much revenue can be preserved in the face of uncertain market the major novelty is the construction of a winner diagram which captures all possible executions of an auction on an uncertain set of bidders and a general possibility result by analyzing the winner diagram shows the proposed bound on the revenue guarantee strength the paper is wellwritten all the ideas are clearly discussed and properly organized the paper states that the proposed model of shrinking markets is the first formal model in multiitem setting which seems technically nontrivial weaknesses na i did not notice any docsepthis paper studies the mechanism design problem when all the players valuation functions are known and each player participates with certain probability it provides a randomized mechanism with revenue lower bound guarantee and also a samplebased mechanism which can be 
implemented efficiently strengths 1 i like the idea that each player participates with uncertainty 2 this paper provides a first lower bound result for the market shrinkage problem and provides a samplebased algorithm weaknesses 1 the assumption that the valuations for all players are known is kind of restricted 2 this paper seems more related to market uncertainty but less to market shrinkage what is the formal economic definition of market shrinkage not applicable
### Summary: | the reviews are all positive the reviewers agree that the paper studies a fundamental problem with nice insights and interesting techniques and the paper is wellwritten | [
…input_ids elided: 1,425 token IDs, one per row in the original dump (30003, 310, 1677, …, 15720, 209)… ] | [
…attention_mask elided: 1,425 entries, all 1… ] | [
…labels elided: 1,425 token IDs… ] |
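The review of the market-shrinkage paper above highlights that each bidder participates only with some probability and that the paper gives a sample-based algorithm. As a rough, hypothetical illustration of that flavour of estimate (not the paper's actual mechanism or its lower-bound construction), the sketch below draws random participation patterns and averages the value a placeholder rule extracts; `sampled_value`, `valuations`, `p` and `mechanism` are all names introduced here for the example.

```python
import random

def sampled_value(valuations, p, mechanism, num_samples=1000, seed=0):
    # Monte Carlo estimate of the value `mechanism` extracts when every bidder
    # shows up independently with probability p; bidders absent from a draw
    # simply do not bid.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        present = [v for v in valuations if rng.random() < p]
        total += mechanism(present) if present else 0.0
    return total / num_samples

# toy usage: treat the extracted value as the highest surviving valuation
estimate = sampled_value([10.0, 7.0, 3.0, 1.0], p=0.5, mechanism=max)
```

With enough draws this converges to the expected value under random participation, which is the kind of quantity a guarantee about achievable welfare in a shrinking market would be stated against.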
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper discusses vqvae for learning discrete latent variables and its application to nmt with a nonautoregressive decoder to reduce latency obtained by producing a number of latent variables that is much smaller than the number of target words and then producing all target words in parallel conditioned on the latent variables and the source text the authors show the connection between the existing ema technique for learning the discrete latent states and hard em and introduce a montecarlo em algorithm as a new learning technique they show strong empirical results on ende nmt with a latent transformer kaiser et al 2018 the paper is clearly written excepting the overloaded appendix and the individual parts of the paper are interesting including the link between vqvae training and hard em the montecarlo em and strong empirical results im less convinced that the paper as a whole delivers on what it promisesclaims the first contribution of the paper is that it shows a simple vqvae to work well on the ende nmt task in contrast to the results by kaiser et al 2018 the paper attributes this to tuning of the codebook but the results table 3 seem to contradict this with a codebook size of 216 even slightly better than the 212 that is used subsequently the reason for the performance difference to kaiser et al 2018 remains opaque while interesting the empirical effectiveness of montecarlo em is a bit disappointing achieving 03 bleu over the best configuration for ende after extensive hyperparameter tuning seen in table 4 and 01 bleu on enfr montecarlo em also seems very sensitive to hyperparameters namely the sample size tables 45 contradicting the later claim that em is robust to hyperparameters the last claimed contribution using denoising techniques is hidden in the appendix an application of an existing technique and not compared to knowledge distillation another existing technique id like to see some of the results in the paper published eventually however the claims need to better match the empirical evidence and for a paper that has better understanding in the title id like to gain a better understanding of the differences to kaiser et al 2018 that make vqvae fail for them but not in the present case clearly written paper interesting novel em algorithm for vqvae strong empirical results on nonautoregressive nmt the strong performance of the vqvae baseline remains unexplained and the claimed explanation contradicts empirical results the new em algorithm gives relatively small improvements with hyperparameters that were likely selected based on test set scores most of the empirical gain is attributable to knowledge distillation which is not a novel contributiondocsepgeneral the paper presents an alternative view on the training procedure for the vqvae the authors have noticed that there is a close connection between the original training algorithm and the wellknown em algorithm then they proposed to use the soft em algorithm in the experiments the authors showed that the soft em allows to obtain significantly better results than the standard learning procedure on both image and text datasets in general the paper shows a neat link between the wellknown em algorithm and the learning method for the vqvae i like the manner the idea is presented additionally the results are convincing i believe that the paper will be interesting for the iclr audience pros the connection between the em algorithms and the training procedure for the vqvae is neat the paper is very well written all concepts are clear 
and properly outlined the experiments are properly performed and all results are convincing cons the paper is rather incremental however still interesting the quality of figure 1 2 and 3 especially figure 3 is unacceptable there is a typo in table 6 row 5 vvae vqvae i miss two references in the related work on training with discrete variables rebar tucker et al 2017 and relax grathwohl et al 2018 the paper style is not compliant with the iclr style revision i would like to thank authors for their effort to improve quality of images in my opinion the paper is nice and i sustain my initial scoredocsepsummary this paper presents a new training algorithm for vectorquantized autoencoders vqvae a discrete latent variable model akin to continuous variational autoencoders the authors propose a softem training algorithm for this model that replaces hard assignment of latent codes to datapoints with a weighted softassignment overall the technical writing in the paper is sloppy and the presentation of the generative model takes the form of an algorithmic description of the training algorithm rather than being a clear definition of the generative model itself the technical presentation of the work by the authors starts only at page 5 taking less than a full page after several pages of imprecise presentation of previous and related work the paper could be significantly improved by making this preceding material more concise and rigorous quantitative experimental evaluation is limited to a machine translation task which is rather uncommon in the literature on generative latent variable models i would expect evaluation in terms of heldout data loglikelihood ie bitsperdimension used in probabilistic generative models and possibly also using measures from the gan literature such as inception scores datasets that are common include cifar10 and resized variants of the imagenet dataset specific comments please adhere to the iclr template bibliography style which is far more readable than the style that you used figure 1 does not seem to be referenced in the text the last paragraph of section 21 is unclear it mentions a sampling a sequence of latent codes the notion of sequentiality has not been mentioned before and it is not clear what it refers to in the context of the model defined so far up to that point the technical notation is very sloppy in numerous places the paper refers to the joint distribution px1xn z1 zn without defining that the distribution factorizes across the samples xizi and without specifying the forms of pzi and pxizi this makes that claims such as computing the expectation in the m step equation 11 is computationally infeasible are not verifiable please be clear about how much is gained by replacing the exact mstep with a the one based on the samples from the posterior computed in the estep what is the reason to decode the weighted average of the embedding vectors rather than decoding all of them and updating the decoder in a weighted manner reference 14 for variational autoencoders is incorrect please use the following citation instead inproceedingskingma14iclr title autoencoding variational bayes author d kingma and m welling booktitle iclr year 2014 the related work section 4 provides a rather limited overview of relevant related work half of it is dedicated to recent advances in machine translation which does not bear a direct connection to the technical material presented in section 3 there is no justification of using causal selfattention on the source embedding is this a typo as 
for the experimental evaluation results it seems that distillation is a much more critical factor to achieve good performance than the proposed em training of the vqvae model unfortunately this fact goes unmentioned when discussing the experimental results what is the significance of the observed differences in bleu scores please report average performance and standard deviations over several runs with randomized parameter initialization and batch scheduling it seems that the tuning of the number of discrete latent codes table 2 in appendix and other hyperparameters table 3 in appendix was done on the test set which is also used to compare to related work a separate validation set should be used for hyper parameter tuning in machine learning experiments it seems that all curves in figure 3 collapse from about 45 bleu to values around 17 bleu why is this the figure is hard to read since poor quality and curves that are superposed docsepthis paper introduces a new way of interpreting the vqvae and proposes a new training algorithm based on the soft em clustering i think the technical aspect of this paper is written concisely introducing the interpretation as hard em seems natural for me and the extension to the soft em training is sound reasonable mathematical complication is limited this is also a plus for many nonexpert readers im feeling difficulties in understanding the experimental part to be honest i think the experimental section is highly unorganized not a quality for iclr submission im just wondering why this happens given clean and organized technical sections first im confusing what is the main competent in the table 1 in the last paragraph of the page 6 it reads our implementation of vqvae achieves a significantly better bleu score and faster decoding speed compared to 10 however ref 10 is not mentioned in the table 1 which bleu is the score of ref 10 second terms vqvae softem and our model approach are used in a confusing manner for example in table 1 below the row our results there are vqvae vqvae with em vqvae distillation vqvae with em distillation the vqvae is not the proposed model correct my understanding is that the proposal is a vqvae solved via soft em which corresponds to vqvae with em third a paragraph robustness of em to hyperparameters is misleading the figure 3 does not show the robustness against a hyperparameter it shows the bleu against the number of samples in fact there is no explanation about what the samples means i think hyperparameters are model constants such as the learning rate of the sgd alphabeta params for adam dimension of hidden units number of layers etc the number of samples are not considered as a model hyperparameter its a dataset property the figure 5 shows the reconstructed images of the original vqvae and the proposed vqvae with em however there is no explanation which hyperparameter is tested to assess the robustness to hyperparameters fourth there is no experimental report on the image reconstructions with cifar and svhn in the main manuscript in fact there is a short paragraph that mentions about the svhn results but it only refers to the appendix i think appendix is basically used for additional results or proofs that are not essential for the main message of the paper however performance in the image reconstruction is one of the main claims written in the abstract the intro etc so the authors should include the image reconstruction results in the main body of the paper otherwise claims about the image reconstructions should be removed 
from the abstract etc insightful understanding of the vqvae as hard em clustering natural and reasonable extension to softem based training of the vqvae unorganized experiment section this simply ruins the quality of the technical part after feedback some of my concerns are addressed the feedback considering the interesting technical parts i raise the score upward to the positive side
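One of the reviews above asks the authors to state explicitly that the joint distribution over samples and latent codes factorizes; written out, the standard assumption (which may or may not be exactly what the paper intends, since the reviews do not quote the model) is

```latex
p(x_{1:n}, z_{1:n}) \;=\; \prod_{i=1}^{n} p(z_i)\, p_\theta(x_i \mid z_i),
\qquad
p(z_i = k \mid x_i) \;\propto\; p(z_i = k)\, p_\theta(x_i \mid z_i = k),
```

with the second expression being the per-sample posterior that an E-step, hard or soft, would use.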
### Summary: | strengths wellwritten strong results for nonautoregressive nmt a novel soft em version of vqvae weaknesses as pointed out by reviewers the improvements are mostly not due to the vqvae modification rather due to orthogonal and not interesting changes eg knowledge distillation if there is a genuine contribution of vqvae it is small and required extensive parameter selection the explanations provided in the paper do not match the empirical results two reviewers criticize the experiments experimental section rigour their discussion overall there is nothing wrong with the method but the experiments are not showing that the modification is particularly beneficial given these results and also given that the method is not particularly novel switching from em to soft em in vqvae it is hard for me to argue for accepting the paper | [
…input_ids elided: 2,048 token IDs, one per row in the original dump (16774, 12510, …, 253, 2929)… ] | [
…attention_mask elided: 2,048 entries, all 1… ] | [
…labels elided: 2,048 token IDs… ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper introduces a new problem called learning with evolving class ontology leco where the objective is to learn finer classes over time the authors have shown that labeling new data with finegrained annotation is more valuable and proposed to use the techniques from semisupervised learning and learning with partial labels literature to utilize the old coarselygrained data the authors have proposed a benchmarking protocol for leco on top of two image classification datasets cifar100 and inaturalist and shown promising results strengths tackles a somewhat new and interesting problem the proposed solution is simple and intuitive experiments are extensive and systematic weaknesses the technical novelty of the proposed solution is low this is not a bad thing in general however in this particular case since the proposed problem leco is somewhat similar to the existing continual learning problems and the proposed solution is an ensemble of multiple existing ideas i find this to be a major weakness i find the current problem setup and solution a bit restrictive it overlooks a lot of practical concerns a assumes a negligible cost for data acquisition if new data is not available one can only relabel old data b only expects finer labels in subsequent time periods what happens if a new category is introduced which doesnt have a parent the pseudolabeling approach will mistakenly assign previous coarsegrained classes to this newly introduced category filtering the pseudolabels based on class ontology might work in this case but if the data is longtailed the new class being a head class this might remove a lot of old data from training the findings are not very surprising if data acquisition cost is ignored and a large portion of data is labeled which is the case in the current experimental setup it is expected that a recent sota semisupervised method will be able to reach a performance closed to supervised upperbound accuracy trained with all data the authors have adequately addressed the limitations and potential negative societal impact of their work docsepthe paper introduces the problem of learning with evolving class ontology leco a general case of class incremental learning where instead of introducing new classes at each time step we refine existing class labels into more granular ones the authors ask whether given incoming data and a fixed annotation budget it is better to annotate the new data with the new labels or reannotate the old data in other words whether it is better to train a classifier on a smaller but homogenous dataset or a larger one with a mix of old coarse and new fine labels based on experiments on cifar and inaturalist the authors conclude that by using semisupervised learning and incorporating the hierarchical taxonomy it is possible to achieve excellent classification accuracy with a heterogeneous dataset in particular by using pseudolabelling and learningwithpartiallabels it is possible to get close to the performance of a classifier trained on a homogenous dataset of the same size the main strength of the paper is clarity the structure is good the writing is easy to understand and the results are presented in a logical incremental order the authors precisely state the research questions and contributions and briefly summarise the conclusions in the introduction which helps to guide the reader the paper includes relevant details to reproduce the experiments i have no reservations about the quality of the submission the experimental setup is sound and the proposed 
models seem appropriate to tackle the presented problem the authors evaluate their methods on two different datasets modified to match the evolving ontology scenario as mentioned below the evaluation could be improved by including related work the biggest weakness of the submitted work is originality while the leco setting appears novel and could be an interesting type of classincremental learning limiting it to two time steps and giving the model access to all the data turns it into fine classification with coarse supervision a problem tackled by other omitted work such as a weakly supervised fine label classifier enhanced by coarse supervision taherkhani et al 2019 weakly supervised image classification with coarse and fine labels lei et al 2017 a pseudolabel method for coarsetofine multilabel learning with limited supervision hsieh et al 2019 from categories to subcategories largescale image classification with partial class label refinement the authors combine existing methods in a novel way which could be a valuable contribution however the lack of comparison to previous approaches makes it hard to judge whether their work advances state of the art and therefore undermines the significance of their findings my suggestion would be to either extend the work to multiple tps or focus on the problem of fine classification with coarse supervision remove the leco formulation and compare their method with existing work i think either would make the submission much stronger the authors correctly identified an essential caveat in their research question obtaining new data is generally equally or more costly than annotating it reannotation is also usually quicker than annotating from scratch because the annotator can use existing labels to constrain the task to a handful of classes one significant problem mentioned but not explored enough is the assumption that the findings would generalise to multiple tps in particular it would be interesting to see the performance of learningwithpartiallabels with multiple levels of granularity docsepthe paper introduces a new continual learning problem setup where class vocabulary becomes more fine grained in a continual fashion different than classic continual learning this setup allows access to the historical examples thus it is not prone to catastrophic forgetting the paper explores several research questions like when the vocabulary evolves whether to annotate new data or relabel the old data without collecting new data how to leverage coarse label of old data and whether to finetune the model trained on old data or train from scratch the paper show that a semi supervised approach that only requires labelling the new data without relabeling the old data is almost equivalent to relabeling all the data both old and new the approach uses the new data to provide pseudo labels for the old data and use the old data coarse labels to resolve reconcile conflicts between the pseudo fine labels and the true coarse labels strengths the paper is well written very easy to read and provides decent baselines and upper bounds the problem setup is important from a practical standpoint and novel to the best of my knowledge i am not an expert in continual learning or hierarchical learning the suggested approach saves relabeling efforts which can become quadratic with the number of time periods weaknesses although the problem setup is general and allowing multiple time periods tp in practice the experiment are only with a single tp i think this is the major weakness of the 
paper because it is not clear how well the approach generalizes as the labels become more fine grained the approach was only evaluated on two datasets it would be useful to provide experiments for a different data domain not vision the problem setup ignores the fact that aside from storage retraining on the old samples may be very costly from a compute standpoint eg retraining a model with a scale of giga samples like a in autonomous driving may cost millions of dollars there are recent works like 123 that try to alleviate this problem i suggest to discuss it in the related work this setup may introduce biases there may be a distribution shift when collecting more fine grained labels it would have been beneficial to add this type of bias to the benchmark and demonstrate how sensitive is the approach to such a shift it is unclear what are the open questions and the next steps if the approach reaches the upper bound does it mean that this problem is solved experiments there is a better upper bound which is utilizing both the old and new labels for the old data references 1 houlsby et al parameterefficient transfer learning for nlp icml 2019 2 li et al crossdomain fewshot learning with taskspecific adapters cvpr 2022 3 cohen et al this is my unicorn fluffy personalizing frozen visionlanguage representations eccv 2022 ok see my feedback in the weakness section
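The coarse-to-fine pseudo-labelling idea the reviewers describe above (relabel the old, coarsely annotated data with the model trained on the newly collected fine-grained labels, and use the trusted coarse label to veto fine predictions that fall outside its children in the class ontology) can be sketched as follows. This is only an illustrative sketch: the two-level ontology, the class indices and the function name are hypothetical and are not taken from the paper under review.

```python
import numpy as np

# Hypothetical coarse -> fine ontology: each coarse class lists its fine-grained children.
ONTOLOGY = {
    0: [0, 1],   # coarse class 0 refines into fine classes 0 and 1
    1: [2, 3],   # coarse class 1 refines into fine classes 2 and 3
}

def ontology_pseudo_label(fine_probs, coarse_label):
    """Assign a fine pseudo-label to an old sample that only has a coarse annotation.

    fine_probs   -- 1-D array of fine-class probabilities from the model trained on new data
    coarse_label -- the sample's trusted coarse annotation

    Fine classes that are not children of the coarse label are masked out, so a
    conflicting prediction is resolved in favour of the known coarse class.
    """
    masked = np.full_like(fine_probs, -np.inf)
    children = ONTOLOGY[coarse_label]
    masked[children] = fine_probs[children]
    return int(np.argmax(masked))

# Toy check: the model's top fine class (index 2) belongs to coarse class 1, but the
# sample is annotated as coarse class 0, so the pseudo-label falls back to fine class 0.
probs = np.array([0.30, 0.10, 0.55, 0.05])
print(ontology_pseudo_label(probs, coarse_label=0))  # -> 0
```

Old samples pseudo-labelled this way can then be mixed with the newly annotated fine-grained data for joint training, which is the semi-supervised recipe the reviews refer to.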
### Summary: | the setting of evolving and refining classes over time is certainly a practical one in domains such as text classification this paper offers some insights on questions like whether the entire data should be relabeled or can one achieve near optimal performance by labeling only the new chunk the paper concludes that joint training on old and new data even if inconsistent in conjunction with semisupervised learning can be fairly effective | [
… 1,664 integer token IDs (input_ids: tokenization of the Input and Summary text above)
] | [
… 1,664 ones (attention_mask)
] | [
… 1,664 integer token IDs (labels, apparently mirroring input_ids)
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper tackles weaklysupervised audiovisual video parsing and proposes a multimodal grouping network to explicitly group classaware matching semantics with classaware unimodal grouping module and modalityaware crossmodal grouping module experimental results are shown on the look listen and parse llp dataset with the generalizability to the audiovisual contrastive learning and label refinement strengths this paper tackles weaklysupervised audiovisual video parsing considering the asynchronous possibility of the two modalities weaknesses 1 the introduction of this paper is not strongly motivated but only with general and brief description the introduction lists several related papers but the relations of these papers to the proposed model are not well illustrated 2 l63 the experiments can demonstrate the superiority of our mgn over stateoftheart avvp approaches and its generalizability to contrastive learning and label refinement when the authors fist mention contrastive learning and label refinement 4 in the introduction these two concepts are not explained but directly appear with citation to another paper which makes the paper reading confusing 3 for the results in table 2 ablation studies on classaware unimodal grouping cug and modalityaware crossmodal grouping mcg blocks each of cug block and mcg block is ablated as a whole what about the effects of the design choices in each block eg the effect of concatenating class labels in classaware unimodal grouping cug block 4 only one dataset is experimented on the authors list the model limitations of limited data and worse performance with deeper network and point out the possible solutions of semisupervised training and incorporating more intermediate supervision the authors also point out the case of rare events in real deployment which is practical and has corresponding techniques focusing on the case docsepthe authors present a new baseline architecture for video parsing using learnable categorial embedding tokens they propose classaware unimodal grouping network in conjunction with a crossmodal grouping network to timestamp audio visual and audiovisual events using only video level labels they show improved results compared to other baselines on llp dataset the manuscript discusses the important problem of video parsing with very coarse labels it is well written and easy to read through the limited number of experiments presented are through and insightful the proposed method if effective compared to other baselines as evident from the qualitative and quantitative results however it is heavily inspired from recently proposed groupvit for image segmentation though the authors outline the differences between proposed approach and groupvit in supplementary material they are very nuanced and not significant see questions section docsepthis paper addresses the problem of predicting event labels over time in audio visual data with weak labels a method is proposed that uses attention between audio visual features and learned class embeddings and extracts classspecific embeddings for use in event detection the model is evaluated on the look listen and parse dataset using 11k video clips with video labels for training and around 2k video clips with both audio and video labels for evaluation strengths the proposed method is an interesting idea and seems to help with the task small improvements are achieved on a challenging task evaluated on a sufficiently large test set the ablations in table 2 are informative weaknesses the performance on 
audio events is not improved the analysis of false positives in figure 3 is useful but it seems like the models are just biased to different operating points since their fscores in table 1 are similar so the reduction of false positives in figure 3 must come at the cost of a decrease in recall it would be better to show both numbers or calibrate to the same operating point clarity the paper is a bit difficult to understand due to notational issues these methods without explicit grouping suffer from false predictions due to the modality and temporal uncertainties it is not clear to me why that should be at this point in the paper is this an empirical finding or a hypothesis re equation 6 cross entropy is usually from labels to estimates so i would expect cephatp but here it seems to be used the reverse way also this looks like a vectorized version with binary probabilities in the elements it might be good to define it since technically cross entropy is defined over a distribution rather than a vector of distributions that is i think what is meant is something like cemathbfy mathbfp sumi ceyi pi if not please clarify in the crossmodal attention phicaqkv my understanding is that the key and value should be of the same modality and the query of the other modality so if i understand correctly in equation 1 phicafvt fv fa is probably supposed to be phicafat fv fv and in equation 2 phicafvt fa fv should be phicafvt fa fa from equation 7 on the notation for selfattention changes without warning from a threeargument function to a one argument function phisax which i assume in the notation of 3 would be phisaxi x xi but it would be kinder to the reader to just define it in equation 10 it is not clear why learned weights are needed to transform both the class tokens and the modality specific features is it not equivalent to just transform the features that is ax cdot by xt at b y bt a x cdot y w x cdot y where w bt a in equation 12 the notation is not clear the mathbf1 is not clear to me i was expecting to see a label for the presence of the class im probably misunderstanding something but in any case i think this needs to be better explained perhaps writing out the ce formula element by element would clarify also maybe 13 should come before 12 for clarity errata in order to explicitly grouping classaware the authors acknowledge that performance on audio events is not improved
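Two of the flattened inline formulas in the review above can be read as follows: the vectorised cross-entropy the reviewer asks about is presumably CE(y, p) = sum_i CE(y_i, p_i), applied element-wise over per-class binary probabilities, and the identity behind the question about equation 10 is Ax . By = x^T A^T B y = (B^T A x) . y, i.e. a single learned matrix suffices. A quick numerical check of that identity, with random matrices standing in for the learned projections (names and sizes here are purely illustrative), is below.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8
A = rng.normal(size=(d, d))   # projection applied to the class token (illustrative)
B = rng.normal(size=(d, d))   # projection applied to the modality feature (illustrative)
x = rng.normal(size=d)        # class token
y = rng.normal(size=d)        # modality-specific feature

lhs = (A @ x) @ (B @ y)       # Ax . By : both vectors transformed by learned weights
W = A.T @ B                   # fold the two projections into one matrix
rhs = x @ (W @ y)             # x . (W y) : only one side needs a learned transform
print(np.allclose(lhs, rhs))  # True; (B.T @ A @ x) @ y gives the same value
```

This only sanity-checks the algebra the reviewer sketches; it says nothing about whether the extra parameters matter empirically in the paper's model.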
### Summary: | the authors propose an approach for weakly supervised audiovisual parsing of videos they propose using learnable categorical embedding to do classaware unimodal grouping combined with crossmodal grouping to timestamp audio visual and audiovisual events using only video level labels based on the feedback provided by the reviewers especially since reviewer knz9 increased their score to borderline accept after the rebuttal period we recommend this paper for publication at neurips 2022 the reviewers had some concerns about the paper reviewer knz9 mentioned that the relations of the listed papers to the proposed model were not well explained they also had some concerns about the experimental results and the fact that only one dataset was used in the evaluation reviewer kavj had questions about model performance with event scaling and time resolution lower bounds reviewer f8hf commented on the difficulty in following the notation in the paper and pointed out the results on audio events is not improved we thank the authors for addressing the comments of the reviewers in their review during the author feedback period the authors seem to have addressed some of the concernsfeedback from the reviewers with detailed discussions it would be good to include these discussions as much as possible in the updated paper or supplemental materials | [
… 1,425 integer token IDs (input_ids: tokenization of the Input and Summary text above)
] | [
… 1,425 ones (attention_mask)
] | [
… integer token IDs (labels, apparently mirroring input_ids)
40241,
281,
253,
9414,
281,
816,
4853,
352,
50275,
249,
5150,
884,
352,
310,
417,
2590,
2139,
6311,
13461,
403,
3058,
281,
4979,
1097,
253,
966,
21761,
285,
253,
36453,
2173,
3386,
50276,
261,
352,
417,
6425,
281,
816,
4979,
253,
3386,
50276,
3529,
310,
4589,
260,
5256,
407,
50276,
633,
387,
270,
340,
50276,
2612,
247,
1269,
260,
5256,
50276,
90,
50276,
88,
1269,
260,
5256,
340,
50276,
2811,
259,
50276,
2612,
247,
50271,
249,
5150,
1249,
253,
14951,
310,
417,
2590,
50275,
783,
14168,
3342,
18,
310,
417,
2590,
281,
479,
50276,
74,
369,
16764,
281,
923,
247,
5203,
323,
253,
3361,
273,
253,
966,
50276,
303,
3164,
40663,
1633,
533,
275,
667,
1083,
891,
1158,
436,
3198,
281,
320,
1805,
5544,
4931,
4028,
562,
253,
2636,
7212,
3284,
407,
3284,
651,
19148,
50275,
12563,
5046,
2145,
943,
1705,
1078,
1249,
323,
19843,
50274,
1000,
682,
50275,
249,
1340,
281,
11120,
32827,
966,
13823,
50274,
783,
4477,
14409,
326,
3045,
327,
9797,
3394,
310,
417,
5520,
2490,
187,
4118,
18435,
27,
783,
4477,
12661,
271,
2746,
323,
22112,
22296,
41174,
729,
261,
780,
29072,
273,
10556,
597,
12661,
970,
3037,
494,
31091,
21496,
281,
513,
966,
13823,
32505,
26306,
32827,
5678,
342,
2831,
24353,
32827,
281,
28921,
9797,
5304,
285,
41174,
729,
261,
780,
3394,
970,
760,
3492,
1268,
13301,
50276,
3169,
327,
253,
8680,
2530,
407,
253,
30628,
3340,
1580,
37317,
694,
91,
26,
2559,
616,
4868,
281,
45210,
2997,
846,
253,
30080,
22559,
2180,
359,
5583,
436,
2929,
323,
9311,
387,
5723,
2824,
1384,
1423,
50275,
783,
30628,
574,
690,
7350,
670,
253,
2929,
37317,
694,
91,
26,
5393,
326,
253,
2493,
273,
253,
7117,
9380,
281,
253,
4081,
1566,
497,
417,
973,
5544,
50276,
9328,
671,
574,
690,
7350,
670,
253,
5661,
1543,
285,
253,
958,
326,
760,
581,
10895,
369,
908,
275,
253,
7103,
37317,
465,
580,
75,
574,
3533,
670,
1566,
3045,
342,
2362,
13642,
285,
673,
6064,
2406,
14493,
37317,
269,
25,
45791,
20503,
327,
253,
10183,
275,
1563,
253,
14951,
275,
253,
2929,
285,
8042,
562,
253,
1543,
327,
9797,
3394,
310,
417,
5520,
50276,
664,
5717,
253,
4477,
323,
15974,
253,
5701,
273,
253,
30628,
275,
616,
2278,
1309,
253,
2488,
8680,
2180,
253,
4477,
1646,
281,
452,
9713,
690,
273,
253,
7350,
44333,
432,
253,
30628,
342,
7000,
11985,
50276,
262,
651,
320,
1175,
281,
2486,
841,
11985,
347,
1199,
347,
1896,
275,
253,
9300,
2929,
390,
25702,
4753
] |
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
Strengths: the problem formulation is clean and clearly explained, the method presentation is well written, and the techniques used in the optimization steps after the relaxation step are well motivated.

Weaknesses: it is not clear whether the significance of regression without correspondence is high. The multi-object tracking experiment seems contrived; at least a very good approximation of the permutation matrix can be obtained using descriptors and motion continuity. The issue of initialization is not fully resolved: in the reported experiment in the discussion section, "robot" succeeds 30% of the time, the percentage may vary, and it is not clear how a user of the method should find a good initialization regime. In the experimental setup that most resembles realistic applications (gfc, for the cytometry), the improvement over AM, say a trivial baseline, is not significant. Note that calling a method "robot" will make it very difficult to Google.

docsep

The paper presents a method for robust regression with no correspondences: the association matrix is replaced by a matrix with continuous positive values, and two practical and relevant problems are investigated. This seems to be a very interesting investigation, and the authors have done many commendable things, such as studying relevant problems with real data, comparing to multiple alternatives, and bringing what appears to be an interesting perspective to the problems. The exposition of the method, however, seems to lack a few details that might perhaps have escaped the attention of the authors, and the recommendation unfortunately should tend towards not accepting. The authors should be urged to consider their work inside a wider perspective. The main method that is repeatedly compared to the proposal was only AM, yet the proposal is in many ways actually more similar to expectation-maximization: the Birkhoff polytope here basically appears to represent likelihoods of data assignment according to Gaussian likelihoods, and it does not seem clear whether the authors are exploring some further constraint in their proposal. It would be paramount to contrast the method with EM, since the paper as it is looks like it could basically be proposing a form of EM without recognizing this, and for sure EM must bring many of the advantages over AM that the paper promises.

Some further smaller points. In section one, on the second point ending with "they get stuck in local optima easily": what exactly is the argument here? If we are talking about a local hill-climbing style of algorithm, there is no hope of escaping local optima unless the initialization is improved, the landscape is improved, or the algorithm is modified in such a way that it is not hill-climbing anymore, becoming some form of global optimization. In what way does the proposal differ? EM compared to AM would imply an enhanced landscape, as also seems to be the case with the proposal; another benefit is merely to utilize slower although more robust first-order optimization methods. It is important to make explicit what exactly the proposed benefits are; the way this sentence is written may give the impression that the algorithm proposes something that goes beyond a local-search style of algorithm. Another important point is about initialization, which is certainly critical to all such algorithms. In the experiments the authors suggest there is an improvement there, although the improvement must be related mostly to a better landscape; and what was the initialization after all? This must be evaluated in the context of how good the available initializations are. Still on section one: "efficient first-order optimization algorithms" is efficient relative to what? How would second-order algorithms be classified?

Regarding the experiments: RANSAC, by virtue of being a global optimization algorithm unlike the proposal and the other alternatives, would be expected to reach very high levels of accuracy, even if associated with a great computational cost. When comparing against RANSAC one would expect to see a discussion about time-accuracy compromises; the paper only presents RANSAC as a method that could not reach a satisfactory accuracy, and it should at least present an accuracy at a setting deemed computationally comparable. One final remark about AD: it is in general expected that AD techniques can match explicit derivative formulas, unless the problem is not well suited to the specific AD technique used, or the user is required to do some extra tuning or configuration. It would be nice to review exactly how AD was not suitable and whether there isn't an AD-based solution for that (e.g., forward mode versus backwards mode).

docsep

The authors proposed a novel method for regression problems with outliers. The main idea is to first pose a mixed-integer optimization problem for the regression task and then make the optimization procedure for finding the solution of the problem differentiable; the objective function of the problem is also rephrased as a differentiable function. Based on this, an end-to-end learning approach can be established.

Pros:
1. The motivation of the paper is very clearly stated in the text, and the sketch of the theorems makes the paper easy to understand.
2. The experimental part is good and it proves the efficiency of the proposed method.
3. The idea of converting a mixed-integer program into a differentiable function is elegant.

Cons:
1. The authors say that they are going to somehow relax the one-to-one matching constraints; however, in the main text we can see that the model is still based on strict one-to-one matching constraints. In the experiments on synthetic data, every generated example is in fact a one-to-one matching; for the other datasets, though they are not strictly one-to-one, they are close to one-to-one.
2. Theorem 1 is a trivial result due to total unimodularity and was proved many years ago; maybe it would be better to simply give a citation there.

docsep

In this submission the authors propose a bilevel-optimization-based solution to the problem of regression without correspondence (rwoc).

Strong points:
1. The paper writing is very clear. I didn't know the rwoc problem before reading this submission, but after reading the introduction I can clearly understand the problem setting, its applications (the two provided examples are great), its challenges, and the high-level idea of the proposed solution. Also, the organization and presentation of the experiment part are very clear and easy to follow.
2. Besides the normal case of rwoc, the authors also consider and solve the case of partial one-to-one correspondence; they also demonstrate its application using multiple-object tracking.

There are some issues that can be addressed to further strengthen the submission:
1. For the experiments about multi-object tracking, could the authors include some (at least one) strong baseline? The current experiment setting for this part is not so convincing, and it is a bit difficult to justify the improvement.
2. Besides the application of multi-object tracking, could the authors discuss more potential applications of rwoc? This can enlarge the application scope and make the studied problem more important and practical.

Minor places:
1. Page 2, the paragraph above related works: "rrwoc" should be "rwoc".
2. Fig. 2 and Fig. 3: for the printed (black-and-white) paper version these figures are hard to read; the authors can use different markers or textual labels to differentiate methods.
### Summary:
This paper proposes a method to solve regression without correspondence. The problem is well motivated and the proposed method is technically sound; the motivation, organization, and presentation of the paper are very clear. Reviewers' suggestions to further improve the paper (e.g., clarifications on initialization, comparison and discussion with EM, AD, etc.) were adequately incorporated into the revised manuscript.
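(Aside, not part of the dataset row above: the first review contrasts the proposal with expectation-maximization, where a relaxed, row-stochastic assignment matrix plays the role of Gaussian assignment likelihoods. A minimal, hypothetical sketch of such an EM-style baseline for regression without correspondence follows; the function and variable names are illustrative assumptions, not the reviewed paper's method.)

```python
# Hypothetical EM-style baseline for regression without correspondence (illustrative only).
# X is n x d, y holds the n responses in unknown order; W is a soft assignment matrix
# whose rows sum to 1, beta the regression weights. All names are assumptions.
import numpy as np

def em_shuffled_regression(X, y, sigma=1.0, n_iters=50):
    n, d = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # naive init: pretend the order is correct
    for _ in range(n_iters):
        # E-step: soft assignment of each row x_i to each response y_j under a
        # Gaussian likelihood, giving a point in a Birkhoff-polytope-like relaxation.
        resid = y[None, :] - (X @ beta)[:, None]           # n x n residual matrix
        logw = -0.5 * (resid / sigma) ** 2
        W = np.exp(logw - logw.max(axis=1, keepdims=True))
        W /= W.sum(axis=1, keepdims=True)                   # row-stochastic weights
        # M-step: weighted least squares; with row-stochastic W this reduces to
        # regressing on the expected response per row.
        beta = np.linalg.lstsq(X, W @ y, rcond=None)[0]
    return beta, W
```

The E-step fills a soft assignment matrix and the M-step is an ordinary least-squares update on the expected targets; this is the kind of EM baseline the reviewer asks to be contrasted with alternating minimization.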
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
The paper introduces a method for understanding whether a relation is a direct entailment of another one. The authors have used the relations that already exist in a well-known dataset, Wikidata. I liked the tricks they have used to construct the dataset, e.g., relation sampling, relation expansion, etc.

The paper is written in a way that is a little unclear and hard to follow. For instance, there are continuous mentions of single-letter references, and some of the references are only explained later (for example pr in the KL-divergence calculation). It might also help to have a figure showing the general idea of the model and then mention that you are running experiments with different settings, and maybe put the larger equations on separate lines. It might have been interesting to see statistics on relation entailments, such as what percent of the relations have more than a given number of children; this might also help with understanding the propagation better. It might also be interesting to see some qualitative analysis comparing TransE, DistMult, and ComplEx: are they domain-dependent, and are there scenarios where the others can outperform TransE?

To sum up, the pros of this paper are: introducing interesting aspects of analysis in knowledge graph problems, and analysis of supervised and unsupervised methods to find the entailments between relations. The weaknesses are: the paper is a little hard to follow (maybe it is better to add a section to define all the repetitive terms; also, adding a model figure can help), and although I agree that the authors have done plenty of experiments, some statistics reported on the relations could probably give more insight into the scope of the problem. Minor comment: please highlight the highest numbers in the tables.

docsep

This paper introduces the task of predicting entailment between canonicalized relations in a knowledge graph. The downstream significance of this work lies in teaching models to understand abstract concepts through predicting entailment between relations, thereby understanding a hierarchy of concepts. The relations are represented using information from knowledge graphs as well as information extracted from text; a variety of methods are explored for building this representation: KGE methods such as TransE, embedding the context between textual mentions of the relations' entities, and distribution-based methods. The prediction task is then formulated as a ranking problem where the correct parent relation should be ranked higher than all others. The paper is well written and clear except for a few points below.

Comments/questions: I feel the nomenclature of unsupervised/supervised scoring functions is a bit misleading; it would be better suited to call the two approaches non-parametric vs. parametric methods. 1) How do the cosine and Euclidean similarity metrics serve as a scoring function given that they are symmetric? 2) Is prediction of parent relations done only within the relations in that tier, or over all relations? 3) With regards to the relation instance propagation: if the child relations are propagated to the parent, the representation of the parent would explicitly include information of the child. I might be missing something, but would this not make the task of predicting the parent relation trivial, since they would be the most similar?

docsep

This paper addresses the issue of relation entailment, viz. whether a relation r in a knowledge graph entails a relation r′, which the authors define as a form of relational containment: r entails r′ if r is contained, in set-theoretical terms, by r′. They then propose data-driven methods to sample a gold standard of such containments that they use to evaluate and/or train unsupervised and supervised relation entailment models. The authors derive their gold standard from Wikidata and a number of sampling techniques that are relatively well explained. For their methods they rely on distributed representations of both the relations and their textual groundings, mapping the triples to bags of words and/or syntactic dependencies derived from Wikipedia snippets via a form of reverse entity linking of sorts. They then experiment with, on the one hand, similarity functions and, on the other hand, CNN and BiLSTM encoders. Perhaps unsurprisingly, supervised models perform way better than unsupervised models (from 0.57 to 0.71 accuracy). The models are well described.

This reviewer finds the experiments well described but still incomplete. Indeed, the authors fail to assess the impact of the different input information modalities in the input embedding layers of their neural networks (triple and word embeddings), unless the reader is meant to understand that their base model in table 3b relies only on triple embeddings; this is not clear. Also, this reviewer would like to see results for text-only models: is this better than reasoning with the triples, or with both signals? In similar NLP tasks (think textual entailment) one usually proceeds that way. It would also be interesting, for the sake of completeness, to consider these three cases in table 3a (similarity-based approaches). Last but not least, the scores reported are sometimes quite close; would it be possible to add the standard deviation of your scores somewhere, in particular for table 3b, as is common in the deep learning literature? As is, this reviewer cannot yet see if there was a real improvement or only a statistical fluctuation. The discussion of the results is quite informative.
### Summary:
The paper introduces a method for entailment prediction between relations in a knowledge graph using the Wikidata dataset; they used a few tricks to construct the dataset (relation sampling, relation expansion, etc.). Overall the reviewers agree that this paper deserves publication; however, several aspects of the presentation should be improved: notation needs to be made clearer, a figure would help understand the main idea, and statistics on relation entailments would be useful to present. We strongly recommend the authors take these suggestions into account when preparing the final version.
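(Aside, not part of the dataset row above: the reviews describe formulating parent-relation prediction as a ranking problem scored by embedding similarity, and one reviewer asks how a symmetric metric such as cosine can encode entailment direction. A hypothetical sketch of that ranking setup follows; `rel_emb`, `rank_parent`, and `accuracy_at_1` are made-up names for illustration, not the paper's API.)

```python
# Hypothetical ranking of candidate parent relations by cosine similarity between
# relation embeddings. `rel_emb` maps a relation id to a numpy vector; all names
# are illustrative assumptions.
import numpy as np

def rank_parent(child, candidates, rel_emb):
    c = rel_emb[child]
    scores = {}
    for p in candidates:
        v = rel_emb[p]
        scores[p] = float(c @ v / (np.linalg.norm(c) * np.linalg.norm(v) + 1e-12))
    # Higher similarity -> better rank. Note the symmetry issue the review raises:
    # cos(c, p) == cos(p, c), so this score alone cannot encode entailment direction.
    return sorted(scores, key=scores.get, reverse=True)

def accuracy_at_1(pairs, candidates, rel_emb):
    # pairs: iterable of (child, gold_parent) ids
    hits = sum(rank_parent(c, candidates, rel_emb)[0] == g for c, g in pairs)
    return hits / len(pairs)
```

The correct parent is counted as a hit only when it is ranked first; replacing the score with a learned, asymmetric (parametric) function is one way to address the direction problem the reviewer points out.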
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper focuses on the problem of extracting or reconstructing mesh surface from raw point cloud the key idea is to learn an unsigned distance function to progressively get to the real surface the unsigned distance field is critical to deal with objects that are not watertight but with inner parts the authors proposed a consistency aware loss to keep the consistency of the learned unsigned distance fields to avoid adversarial optimization a surface extraction algorithm is also proposed to extract mesh surface from the learned unsigned distance function

strengths: first using unsigned distance function is critical and important to handle complicated object structures they have demonstrated better performance on public dataset both visually and in numbers

weakness: from my understanding this proposed method doesnt have potentials to handle any noise in the raw point cloud which means they require a clean point cloud as input but in real scenario the raw point cloud is not noisefree another issue is that the surface extraction algorithm is a bit tricky the extraction is mainly controlled by computing the sign of cls function but how could we guarantee the gradient of the unsigned distance field is always accurate finally the authors havent presented any failure cases or any discussion on when the proposed method would fail

the authors have mentioned one limitation or possible future work which is to have a coarsetofine divided grids but they havent clearly discussed the limitations and also they havent demonstrated any failure cases please refer to the weakness part about some of my thoughts on the limitations

this paper presents a method to mesh point clouds it performs optimization on a single scene without any training it claims 2 contributions 1 a loss function and optimization strategy which in my understanding is essentially the one presented in neural pull 26 for signed distance function used for unsigned distance function and symmetrized it is often referred to as a consistency aware field consistency loss and as fighting adversarial optimization which makes little sense to me 2 a meshing strategy which to me seems an adaptation of marching cube to unsigned distance function i would say there is a 3rd contribution which is not claimed in the intro but is a part of the method section which is the progressive ie 2 step in practice surface approximation even if the quantitative gains associated to it are small it presents results on several datasets that seem to improve state of the art

while i am not an expert of the area the benefits of the proposed approach in term of results seem clear to me which i think is the main strength of the paper the proposed approach also seems to make a lot of sense and is quite simple

i see several weaknesses in the paper

1 i found the paper very hard to parse while the proposed approach is quite simple this is particularly true for 31 and 32 i think this is written only for people who are very familiar with the 3 most related papers it has far too many forward and backward pointers for example in 31 before anything about the method has been explained no loss function nothing on optimization there are results comparisons with 3 baselines and discussion of the differences l 126151 and figure 2 i do not think it can make sense before the full paper has been read and understood similarly l 167 discusses results obtained when using equation 3 which is presented l 185 if this was a journal submission this could easily be solved with a major revision for a conference paper this is much harder to trust the authors with a major rewriting of the paper another thing that annoyed me is that i could understand none of what the paper was doing from the abstract and intro terms like consistency aware field consistency loss and as fighting adversarial optimization are not explained while they refer to very simple ideas and i think they are designed to impress but make little sense / are not adapted not sure if its the fault of the authors or if they reuse terms from other papers

2 i am unsure what the real technical contributions are to me the first contribution which is a big part of the method section 31 and 32 is actually a very small modification of neural pull 26 i think this is not recognized enough in the paper and find that a very annoying issue the progressive surface approximation seems novel but this is not claimed clearly so i am unsure whether this might be following another paper the surface extraction seems to be a relatively simple adaptation of marching cube if the authors agree this again should be acknowledged much more clearly in the abstract intro and 34 to me the real contribution is actually taking the previous small ideas together and making them into a very effective algorithm which could make for a great paper if only it was acknowledged better and each part explained much more simply unfortunately this again puts me on the verge of recommending rejection for a conference paper

3 smaller concerns are associated with the experiments which again i found in general convincing since the output is a mesh i would like to see metrics related to meshes not only point cloud for example it would be quite easy to measure normal distances up to flip i am confused by the low confidence range experiment table 7 i guess the low confidence range should be understood as in addition to sigma so 09sigma for example actually means between 1 and 19sigma from the origin is that right if thats right why not experiment with much smaller values eg 01 and 05 sigma and in any case with larger values 2 and 4 sigma that would make the trend much clearer in any case this should be better explained a small figure earlier in the method section could help

all in all because i think the method makes sense and because as far as i can judge not being an expert the results seem very good and the ablation convincing i would still tend to recommend accepting the paper trusting the authors with a major rewriting yes

this paper proposed a method for surface reconstructions by training a neural network to predict unsigned distance fields udf the learned udfs are consistencyaware and can be trained without ground truth distance fields point normals or large scale training datasets a high quality surface can be extracted from the gradient vector field of the learned udfs the paper has achieved appealing results compared to some of the state of the art algorithms

1 the paper carefully examined the current failure mode of the udf approximation methods thus proposing the consistencyaware field learning loss and the progressive approximation paradigm these strategies greatly improved the quality of the learned udf as illustrated by the paper 2 traditional marching cube algorithms cannot be directly applied on udfs since there is no insideoutside information in an udf the paper proposed a novel surface extraction algorithm by looking at the gradient vector field of the learned udf from the originality and quality perspective the paper has done well 3 presentation is well done language and visualization are clear 4 from a significance perspective the reviewer believes the paper has boosted the sota by a quite large margin the reconstructed surface has much higher quality in many challenging scenarios

the author addressed the limitations of uniformly dividing grids for surface extraction

this paper presents a framework to learn unsigned distance functions udf from point clouds the learned continuous udf can then be used to extract surface to represent 3d geometry one of the challenges of learning a continuous udf from a discrete point cloud is the instability of gradient due to the sparsity of points to this end the authors propose a novel loss function with a field consistency constraint they also designed a progressive scheme to learn more local details unlike sdf that can recover surfaces using the marching cubes algorithm directly udf cannot pass the insideoutside test due to the lack of direction information ie sign therefore this paper proposes to use the relative angle between query points to test whether they cross the isosurface experiments demonstrate the proposed method outperforms existing methods and ablation studies verify the design choices

strengths: the paper is well written it is easy to follow the figures are greatly helpful for readers to understand the idea the proposed idea is interesting and effective as it is supported by the superior performance in comparisons against existing methods furthermore ablation studies are sufficient to validate design choices

weaknesses: figure captions i would recommend to expand figure captions so that readers dont need to jump back and forth between text and figure

na
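The reviews above describe the surface-extraction step only in words: a UDF has no sign, so the usual inside/outside corner test of marching cubes is replaced by comparing gradient directions at neighbouring query points. The sketch below is a minimal illustration of that general idea, not the authors' actual algorithm; the analytic sphere UDF, the helper names `udf`, `udf_grad` and `edge_crosses_surface`, and the finite-difference gradient are all assumptions made up for the example (a learned network would supply the distance and its gradient instead).

```python
import numpy as np

# Toy stand-in for a learned unsigned distance function (UDF):
# unsigned distance to a sphere of radius 0.5 centred at the origin.
def udf(x):
    return np.abs(np.linalg.norm(x, axis=-1) - 0.5)

def udf_grad(x, eps=1e-4):
    # Finite-difference gradient of the UDF (a trained network would provide
    # this via automatic differentiation instead).
    g = np.zeros_like(x)
    for i in range(x.shape[-1]):
        step = np.zeros(x.shape[-1])
        step[i] = eps
        g[..., i] = (udf(x + step) - udf(x - step)) / (2 * eps)
    return g

def edge_crosses_surface(p, q):
    # A UDF has no sign, so the inside/outside test of classical marching cubes
    # is unavailable.  Compare gradient directions instead: on opposite sides of
    # the zero level set the (normalized) gradients point away from each other,
    # so their dot product is negative.
    gp, gq = udf_grad(p), udf_grad(q)
    gp = gp / (np.linalg.norm(gp) + 1e-12)
    gq = gq / (np.linalg.norm(gq) + 1e-12)
    return float(np.dot(gp, gq)) < 0.0

p = np.array([0.45, 0.0, 0.0])  # grid corner just inside the sphere surface
q = np.array([0.55, 0.0, 0.0])  # grid corner just outside the sphere surface
r = np.array([0.65, 0.0, 0.0])  # another corner outside the surface
print(edge_crosses_surface(p, q))  # True: the edge between p and q straddles the surface
print(edge_crosses_surface(q, r))  # False: both corners lie on the same side
```

In a real pipeline this test would be applied to every edge of a dense grid around the point cloud and the crossing edges handed to a marching-cubes-style mesher; as one reviewer notes, the reliability of such a test depends on how accurate the learned gradient field is near the surface.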
### Summary:
all reviewers were clearly in favor of accepting the paper prerebuttal there was limited discussion postrebuttal the ac examined the paper the reviews and the authors response and is inclined to accept the paper the ac encourages the authors to use their extra page to incorporate their responses to the reviewers into the final version of the paper in particular the ac would encourage carefully considering the feedback on presentation from 1bdf

[input_ids, attention_mask, labels: the per-token arrays for this row (they re-encode the Input and Output text above; the attention_mask entries are all 1) are elided here as machine-readable duplicates of the text.]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in graph neural networks convolutional layers operate by aggregating information about local neighborhood structures this paper extends to hypoelliptic diffusions and the derived tensor valued graph operator in short this paper builds the operators by using random walks but instead of taking only the endpoints of random walks takes averages over pathdependent functions here the ordering of vertices visited in a random walk matters using random walks to build feature vectors for graphs is a classical idea eg deepwalk which has also been extended to other local graph features such as breadthfirst trees etc eg node2vec this paper uses the entire path rather than the endpoint and builds an algebraic operator

strengths: the approach of extending from using only the endpoint of a random walk to the entire path of the random walk is natural the paper presented a theorem theorem 2 which says that two labelled graphs are the same if and only if the distributions of kstep random walks of the two graphs are the same the authors use diffusion over a tensor transition matrix and introduce an attention mechanism the paper explained how to efficiently calculate the operator on a graph

weaknesses: see next box on questions

na

the authors present a novel version of the graph laplacian which instead of capturing information focused on the start/end of a path emphasizes information along paths this is inspired by previous notions of the hypoelliptic laplacian moreover since the hypoelliptic graph laplacian is a tensorvalued graph operator the authors provide lowrank approximations which allow numerical computations as it scales linearly with the number of edges contrary to the standard case which grows quadratically with the number of nodes together with the hypoelliptic laplacian another key ingredient is the perspective of free algebras which is relevant since the order in which nodes are visited in a path matters and free algebras induce algebra multiplications that are noncommutative

the paper does a great job of briefly introducing the basic notions related to the hypoelliptic graph and the corresponding free algebra notions that are required to grasp the basic intuitions of the concepts here considered further supporting material is presented in the supplementary material that helps to grasp basic concepts related to this paper the paper presents an interesting modeling approach by showing/motivating that the proposed laplacian indeed has certain properties that might be appealing in the context of learning long dependencies for gnns

the analysis of the path and not only relying on where it starts and where it ends has already been suggested for instance in a different context by the nonbacktracking operator or the corresponding bethe hessian even in the context of spectral clustering where the graph laplacian plays a crucial role a bibliography a saade alaa florent krzakala and lenka zdeborova spectral clustering of graphs with the bethe hessian advances in neural information processing systems 27 2014

it seems the authors do not provide potential negative societal impacts

in this work the authors present a formalism where a tensor valued version of the graph laplacian which they call the hypoelliptic graph laplacian is used to model the distribution of node attribute propagation via random walks as a diffusion process they show how the solutions for the state of the resulting system after length k walks can be computed via low rank approximation algorithms and compare a concrete realization of their method to other common architectures for graph modeling they argue that their method excels in representing long range dependencies while scaling more favorably than other approaches in the size of the graph

strengths

qualityclarity: sections 13 along with the relevant appendices are very well written and do a remarkable job at providing an understanding of the background machinery required to understand the proposed method in particular the punchline results through eq 11 and 12 about the theoretical expressivity in node and graph representation extractable from the solution to the hypoelliptic graph diffusion equation are exciting

originality: benefit of the doubt is granted on the overall presentation of the formalism being novel reviewer is not a theorist though i assume novelty is mostly concentrated on the tensor valued version of the diffusion equation eq 11

weaknesses

clarity: the work is remarkably readable but there are a couple of points at the moment one reads line 105 it is not immediately clear that we have built the map between $\mathrm{seq}(\mathbb{R}^d)$ and h rather it should cite appendix lines 612618 as these are sort of the final step of that construction the section on the low rank approximation of the functionals and the building neural networks section is much harder to understand than the sections that precede it

significance: despite the theoretical expressivity of the proposed model class the particular setting evaluated only explores a max tensor degree of 2 this seems like a severely restricted realization of the model based on my understanding that m sort of represents the dimension of the summary capturing capacity of the lifted form of the diffusion solution the method only performs competitively with graphtrans ideally since it is being proposed as an alternative to the family of available gnns the implementation results will be more developed in future work to more convincingly demonstrate the benefits in practice that the framework argues it offers in theory

1 somewhat duplicated from above i am concerned about the m = 2 restriction of the implementation results presented it seems like this may take the power out of the method in some sense by barely maintaining the added depthtensorial nature beyond the classical diffusion equation using the normal graph laplacian further and this is just a conjecture but it is possible that this limitation on m plus the rank1 approximation scheme is responsible for the shakeout of the results wrt the graphtrans competitive baseline interested to hear the authors discuss this further

2 the scalability of the approach may be problematic given that the complexity analysis already relies on the rank1 approximation and also contains an m^2 term suggesting that the small m setting may be a requirement for practical implementations without demonstration on graphs of realistic scales well beyond 30 nodes as well as the resultant larger graph diameters and path lengths its unclear whether the approach has to always be nearly reduced to the classical setting in order to be applied to graphs of nontrivial sizes

the paper introduces hypoelliptic graph laplacians a generalization of classical graph laplacians to higher order tensors that are capable of storing the entire history of a given random walk this is studied in the context of attributed graphs where each node is equipped with vector information theoretical analysis is carried over to demonstrate that 1 the analogous tensorlike diffusion process retains the probabilistic interpretation of standard scalar random walks and 2 graph feature maps can be defined on these processes without losing expressive power a lowrank approximation is provided to make the approach scalable and experiments are conducted on datasets that present longrange dependencies

contribution and novelty: to my knowledge i am not an expert in the field of hypoelliptic operators on noneuclidean domains the contribution of the paper is original this applies to both the theoretical analysis in theorems 1 and 2 and the lowrank approximation in theorem 3 that builds on previous work in [5, 6]

presentation: the paper is very wellpresented and the organisation does not require main modifications on a minor side some of the paper is quite technical and potentially obscure to a significant part of the community

strengths in random order: the exposition is clear the problem at hand is of interest given that standard mpnns are known to struggle on tasks with longrange dependencies mainly due to the problem of oversquashing the theoretical analysis in theorem 3 is of relevance in the context of the tools and techniques introduced given that it provides a scalable approach the implementationexperiment section has detailed explanation with sufficient ablation studies that strengthen the message and indeed highlight the robustness of the method

weaknesses in random order: i reserve some unanswered points to the questions paragraph below depending on the rebuttal such points could be solved or become weaknesses with consequent adjusting of the score theorem 3 inevitably leads to a loss in expressive power and it is not entirely clear how the loss would depend on m and r minor point the proposed method is not significantly better than faster standard gnn methods on tasks like node classification where longrange dependencies may not be crucial this is somewhat expected but it limits the impact of the approach

societal impact is not reported limitations are partly addressed in the paper
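The reviews repeatedly contrast endpoint-only random-walk features with features of the whole ordered path. The sketch below illustrates that contrast on a toy attributed graph; it is not the paper's hypoelliptic graph Laplacian or its low-rank algorithm. The graph, the node features, the walk length k = 4, and the degree-2 feature map in `path_feature` (attribute sum plus ordered outer products, so reversing a walk generally changes its value) are all assumptions invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny attributed graph: adjacency lists plus one feature vector per node.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
feat = {n: rng.normal(size=3) for n in adj}

def sample_walk(start, k):
    # Uniform k-step random walk; returns the ordered sequence of visited nodes.
    walk = [start]
    for _ in range(k):
        walk.append(int(rng.choice(adj[walk[-1]])))
    return walk

def endpoint_feature(walk):
    # Classical view (deepwalk-style): only the node where the walk ends matters.
    return feat[walk[-1]]

def path_feature(walk):
    # Order-sensitive stand-in for a degree-2 tensor feature of the whole path:
    # level 1 sums the visited attributes, level 2 sums outer products of
    # attribute pairs taken in walk order (i < j), so reversing the walk
    # generally changes the value.
    xs = [feat[n] for n in walk]
    lvl1 = np.sum(xs, axis=0)
    lvl2 = np.zeros((3, 3))
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            lvl2 += np.outer(xs[i], xs[j])
    return np.concatenate([lvl1, lvl2.ravel()])

# Monte-Carlo estimate of the expected feature over k-step walks started at node 0.
walks = [sample_walk(0, k=4) for _ in range(2000)]
endpoint_avg = np.mean([endpoint_feature(w) for w in walks], axis=0)
path_avg = np.mean([path_feature(w) for w in walks], axis=0)
print(endpoint_avg.shape, path_avg.shape)  # (3,) vs (12,)
```

Averaging `path_feature` over many sampled walks plays the role of the expectation over k-step walks that the reviews mention, and the order-sensitive second level is exactly the kind of information an endpoint-only summary such as `endpoint_feature` cannot capture.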
### Summary: | this paper proposes a new principled graph learning method to aggregate information local neighborhood the proposed method essentially aggregates information from random walk but not only wrt end points but also wrt the paths all reviewers agree that the idea is novel and the idea is well founded valid questions regarding expressive power computational expense etc were raised and were addressed reasonably during rebuttal period | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
249,
4216,
11454,
6928,
27311,
267,
8090,
10196,
407,
9406,
839,
1491,
670,
1980,
9168,
5289,
436,
2929,
8725,
281,
3500,
80,
437,
9155,
280,
2171,
16723,
285,
253,
6012,
13148,
21392,
4216,
5572,
50275,
249,
2159,
436,
2929,
21168,
253,
9158,
407,
970,
3632,
16771,
533,
3185,
273,
3192,
760,
253,
29959,
273,
3632,
16771,
3936,
31218,
689,
1854,
6820,
3470,
1060,
253,
15824,
273,
13388,
11580,
275,
247,
3632,
2940,
8213,
970,
3632,
16771,
281,
1973,
4735,
11390,
323,
14580,
310,
247,
8946,
2934,
24088,
3676,
13678,
534,
556,
671,
644,
6508,
281,
643,
1980,
4216,
3386,
824,
347,
37535,
7053,
7139,
3966,
24088,
4666,
19,
4642,
436,
2929,
4648,
253,
2862,
1854,
2581,
685,
253,
21229,
285,
1973,
271,
20157,
5572,
50273,
45563,
253,
2746,
273,
6880,
432,
970,
760,
21229,
273,
247,
3632,
2940,
281,
253,
2862,
1854,
273,
253,
3632,
2940,
310,
3626,
253,
2929,
3559,
247,
10012,
10012,
374,
534,
2296,
326,
767,
27214,
14580,
403,
253,
1072,
604,
285,
760,
604,
253,
3268,
273,
465,
10539,
3632,
16771,
273,
253,
767,
14580,
403,
253,
1072,
253,
4477,
897,
12393,
689,
13148,
5502,
4315,
285,
9569,
271,
4116,
5122,
253,
2929,
5544,
849,
281,
14556,
10173,
253,
5572,
327,
247,
4216,
50274,
20881,
1255,
265,
923,
1735,
3817,
327,
3533,
5549,
5474,
339,
431,
248,
4477,
1246,
247,
4460,
2715,
273,
253,
4216,
826,
43917,
534,
3185,
273,
9232,
1491,
7106,
327,
1265,
423,
273,
247,
1854,
352,
35520,
1491,
2112,
11865,
436,
310,
11797,
327,
2045,
27367,
273,
253,
3500,
80,
437,
9155,
280,
826,
43917,
25761,
1580,
253,
3500,
80,
437,
9155,
280,
4216,
826,
43917,
310,
247,
13148,
24995,
4216,
5572,
253,
4477,
2085,
1698,
14714,
34754,
534,
1581,
10704,
30745,
310,
352,
11498,
23352,
342,
253,
1180,
273,
9297,
10214,
281,
253,
2629,
1083,
534,
17202,
13284,
5372,
342,
253,
1180,
273,
7632,
50276,
36776,
342,
253,
3500,
80,
437,
9155,
280,
826,
43917,
1529,
2234,
24405,
310,
253,
8668,
273,
1959,
21360,
534,
310,
4623,
1580,
253,
1340,
275,
534,
7632,
11580,
275,
247,
1854,
310,
4623,
285,
1959,
21360,
10808,
8697,
25219,
326,
403,
1327,
42253,
50275,
783,
2929,
1057,
247,
1270,
2628,
281,
13366,
9569,
253,
5044,
27367,
2905,
281,
253,
3500,
80,
437,
9155,
280,
4216,
285,
253,
3969,
1959,
8697,
27367,
326,
403,
2424,
281,
15909,
253,
5044,
16875,
4431,
273,
253,
12342,
1060,
2783,
2007,
8109,
2144,
310,
3559,
275,
253,
24864,
2144,
326,
7729,
281,
15909,
5044,
12342,
2905,
281,
436,
2929,
50276,
783,
10262,
271,
4722,
14053,
2746,
407,
4645,
24013,
400,
839,
326,
253,
4081,
826,
43917,
6296,
556,
2176,
3607,
326,
1537,
320,
23176,
275,
253,
3634,
273,
4715,
1048,
21011,
323,
18976,
2224,
253,
1783,
273,
253,
1854,
285,
417,
760,
22128,
327,
835,
352,
7866,
285,
835,
352,
7637,
556,
2168,
644,
5125,
323,
4227,
275,
247,
1027,
3634,
407,
253,
1327,
2135,
12544,
5572,
390,
253,
3969,
701,
248,
344,
859,
757,
1014,
275,
253,
3634,
273,
9879,
17524,
835,
253,
4216,
826,
43917,
7120,
247,
9560,
2554,
247,
50276,
34424,
247,
618,
796,
355,
5781,
892,
21875,
36407,
91,
518,
7080,
285,
8472,
4530,
1182,
615,
3399,
729,
9879,
17524,
273,
14580,
342,
253,
701,
248,
344,
859,
757,
16424,
275,
11454,
1491,
5162,
2718,
3435,
4059,
352,
3133,
253,
4477,
513,
417,
2085,
2442,
4016,
38058,
16274,
5474,
339,
9852,
436,
789,
253,
4477,
1246,
247,
30221,
835,
247,
13148,
21392,
2715,
273,
253,
4216,
826,
43917,
534,
597,
1067,
253,
3500,
80,
437,
9155,
280,
4216,
826,
43917,
310,
908,
281,
1566,
253,
3268,
273,
4666,
11104,
18634,
3066,
3632,
16771,
347,
247,
12393,
1232,
597,
921,
849,
253,
5482,
323,
253,
1375,
273,
253,
4795,
985,
846,
2978,
465,
16771,
476,
320,
10302,
3066,
1698,
5958,
11193,
11333,
285,
7277,
247,
11859,
22786,
273,
616,
1332,
281,
643,
1846,
35615,
323,
4216,
14053,
597,
9059,
326,
616,
1332,
2507,
1241,
275,
9999,
1048,
2491,
21011,
1223,
13642,
625,
49148,
685,
643,
7274,
275,
253,
1979,
273,
253,
4216,
50275,
296,
3755,
20556,
50276,
15177,
498,
15752,
7118,
2145,
2112,
342,
253,
4623,
14801,
1271,
403,
1077,
973,
3542,
285,
513,
247,
13406,
2628,
387,
5277,
271,
4685,
273,
253,
4114,
20949,
2424,
281,
2096,
253,
4081,
1332,
275,
1798,
253,
18750,
1282,
1543,
949,
16186,
1903,
285,
1249,
670,
253,
10527,
3890,
2351,
275,
4666,
285,
4216,
6779,
4908,
494,
432,
253,
2900,
281,
253,
3500,
80,
437,
9155,
280,
4216,
12393,
5150,
403,
12302,
50275,
19164,
414,
5649,
273,
253,
5545,
310,
7169,
327,
253,
4583,
9759,
273,
253,
30221,
1146,
4460,
37317,
310,
417,
247,
29075,
382,
2167,
891,
5467,
38135,
310,
6571,
16761,
327,
253,
13148,
21392,
2715,
273,
253,
12393,
5150,
16186,
1903,
50275,
20881,
1255,
265,
50276,
498,
15752,
253,
789,
310,
24678,
34025,
533,
627,
403,
247,
4564,
2792,
50276,
255,
253,
2774,
581,
9563,
1386,
12446,
352,
310,
417,
4745,
2590,
326,
359,
452,
4270,
253,
3711,
875,
22510,
1324,
1288,
69,
285,
288,
2581,
352,
943,
26542,
30762,
3104,
721,
13381,
1093,
347,
841,
403,
3686,
273,
253,
2457,
3213,
273,
326,
5140,
50276,
783,
2593,
327,
253,
1698,
5958,
11193,
273,
253,
1159,
932,
285,
253,
3652,
11454,
6928,
2593,
310,
1199,
12150,
281,
2096,
685,
253,
7118,
326,
8436,
70,
352,
50276,
9188,
40348,
5747,
253,
10527,
3890,
2351,
273,
253,
4081,
1566,
966,
253,
1798,
4758,
6760,
760,
33826,
247,
2781,
13148,
4248,
273,
374,
436,
3133,
751,
247,
18270,
11096,
22786,
273,
253,
1566,
1754,
327,
619,
4685,
326,
278,
3686,
273,
6125,
253,
7877,
273,
253,
6010,
26475,
5350,
273,
253,
14287,
830,
273,
253,
12393,
2900,
253,
1332,
760,
17923,
3947,
25785,
342,
17309,
384,
16147,
34243,
1580,
697,
1146,
4081,
347,
271,
5795,
281,
253,
2021,
273,
2130,
18976,
2224,
253,
7092,
1543,
588,
320,
625,
3715,
275,
2852,
789,
281,
625,
2410,
1763,
5356,
7568,
253,
5373,
275,
3946,
326,
253,
7792,
8219,
326,
352,
6131,
275,
3762,
337,
8489,
44223,
432,
1840,
891,
717,
7514,
670,
253,
278,
19,
12400,
273,
253,
7092,
1543,
3559,
352,
3133,
751,
436,
778,
1379,
253,
1612,
562,
273,
253,
1332,
275,
690,
3282,
407,
12345,
11850,
253,
2879,
372,
431,
384,
561,
10317,
3753,
4457,
253,
8946,
12393,
5150,
970,
253,
2622,
4216,
826,
43917,
2007,
285,
436,
310,
816,
247,
24366,
533,
352,
310,
1896,
326,
436,
12291,
327,
278,
5043,
253,
5958,
18,
11193,
6974,
310,
5506,
323,
253,
17941,
483,
273,
253,
1543,
8772,
253,
17309,
384,
16147,
12085,
8245,
6110,
281,
4089,
253,
4477,
2319,
436,
2007,
50276,
19,
253,
9171,
1430,
273,
253,
2746,
778,
320,
20276,
1677,
326,
253,
10454,
1783,
2168,
15771,
327,
253,
5958,
18,
11193,
285,
671,
4428,
271,
278,
19,
1307,
7738,
326,
253,
1355,
278,
4758,
778,
320,
247,
8284,
323,
8542,
27558,
1293,
20028,
327,
14580,
273,
15958,
11498,
973,
4457,
1884,
7632,
347,
973,
347,
253,
29395,
4067,
4216,
37162,
285,
1854,
16095,
697,
12744,
1880,
253,
2746,
556,
281,
1900,
320,
4829,
3777,
281,
253,
8946,
4758,
275,
1340,
281,
3732,
281,
14580,
273,
37825,
9552,
5474,
339,
431,
248,
2929,
23970,
3500,
80,
437,
9155,
280,
4216,
826,
33854,
2458,
50276,
66,
26647,
273,
8946,
4216,
826,
33854,
2458,
281,
2169,
1340,
47454,
50276,
3529,
403,
7032,
273,
20073,
253,
2862,
2892,
273,
247,
1677,
3632,
13678,
436,
310,
5421,
275,
253,
3634,
273,
12877,
14580,
835,
1016,
4666,
310,
13496,
342,
4972,
1491,
10527,
1783,
310,
4824,
689,
281,
7568,
326,
337,
253,
19890,
13148,
3022,
12393,
1232,
32751,
253,
37851,
7914,
273,
2629,
13434,
3632,
16771,
285,
374,
4216,
4735,
8115,
476,
320,
2931,
327,
841,
4870,
32063,
10305,
43541,
1612,
247,
1698,
14714,
11193,
310,
2530,
281,
1056,
253,
2746,
44755,
285,
4679,
403,
5196,
327,
15302,
326,
1246,
1048,
6324,
21011,
50276,
1987,
2382,
285,
38135,
281,
619,
3640,
50276,
74,
717,
417,
271,
6485,
275,
253,
1673,
273,
3500,
80,
437,
9155,
280,
9158,
327,
5293,
26365,
10625,
50276,
783,
7680,
273,
253,
2929,
310,
3236,
436,
10384,
281,
1097,
253,
10527,
1783,
275,
10012,
1249,
285,
253,
1698,
14714,
11193,
275,
10012,
495,
326,
21168,
327,
2045,
789,
275,
8026,
50276,
49836,
253,
2929,
310,
1077,
973,
15068,
264,
285,
253,
19156,
1057,
417,
2430,
2022,
14586,
327,
247,
5884,
1930,
690,
273,
253,
2929,
310,
3240,
7681,
285,
7826,
26591,
281,
247,
1534,
629,
273,
253,
3114,
50276,
296,
3755,
20556,
275,
3632,
1340,
50275,
783,
47284,
310,
2590,
50276,
783,
1895,
387,
1133,
310,
273,
1600,
1677,
326,
2629,
278,
16077,
2224,
403,
1929,
281,
11182,
327,
8892,
342,
1048,
6324,
21011,
7194,
1955,
281,
253,
1895,
273,
689,
23600,
3834,
50276,
783,
10527,
1783,
275,
10012,
495,
310,
273,
17200,
275,
253,
3634,
273,
253,
5657,
285,
5609,
5611,
1677,
326,
352,
3400,
247,
44755,
2746,
50276,
783,
7092,
16217,
2092,
2593,
556,
7000,
8813,
342,
4209,
28913,
2175,
326,
17084,
253,
3935,
285,
6296,
6780,
253,
31640,
273,
253,
1332,
50275,
20881,
1255,
265,
275,
3632,
1340,
50276,
74,
15917,
690,
440,
42195,
2792,
281,
253,
3533,
12494,
2708,
7293,
327,
253,
30080,
22559,
824,
2792,
812,
320,
14042,
390,
2489,
32213,
342,
40082,
19427,
273,
253,
4868,
50275,
33921,
495,
24473,
5644,
281,
247,
2957,
275,
43541,
1612,
285,
352,
310,
417,
7094,
2590,
849,
253,
2957,
651,
3469,
327,
278,
285,
391,
5884,
1127,
50276,
783,
4081,
1332,
310,
417,
3012,
1805,
685,
7938,
15291,
305,
9866,
3082,
327,
4836,
751,
4666,
42070,
835,
1048,
6324,
21011,
778,
417,
320,
9560,
436,
310,
8489,
3264,
533,
352,
7787,
253,
3486,
273,
253,
2746,
50276,
84,
1118,
16190,
3486,
310,
417,
2361,
7364,
403,
13730,
9713,
275,
253,
2929,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
247,
747,
3505,
74,
6216,
4216,
4715,
1332,
281,
19737,
1491,
1980,
9168,
253,
4081,
1332,
9093,
29111,
1491,
432,
3632,
2940,
533,
417,
760,
8772,
990,
2792,
533,
671,
8772,
253,
11865,
512,
30628,
5194,
326,
253,
2934,
310,
4460,
285,
253,
2934,
310,
973,
11420,
3588,
3533,
5001,
43541,
1612,
15180,
14247,
3966,
497,
5439,
285,
497,
9713,
12054,
1309,
30080,
22559,
2180,
209
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
[… token-ID array cell (input_ids style column), abridged for readability …] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors propose a new training method named increasing margin adversarial ima training to improve dnn robustness against adversarial noises the ima method increases the margins of training samples by moving the decision boundaries of the dnn model far away from the training samples to improve robustness under strong 100pgd whitebox adversarial attacks the authors evaluated the ima method on four publicly available datasets overall i vote for ok but not good enough rejection the proposed strategy sounds reasonable and worked well with a simple dataset the moons dataset however when it was applied to more complicated real datasets such as fashionmnist svhn and the covid19 ct image dataset there was no significant achievement if compared to the mma approaches thus further investigation is needed to demonstrate the benefit of ima on real datasets in addition the authors tested only one medical image dataset the covid19 ct image dataset since there are multiple modalities in the medical field and the diversity among datasets is quite large it is too early to emphasize the advantage of the proposed method in the medical field in general like the last phrase in the conclusion we hope our approach may facilitate the development of robust dnns especially in the medical field
docsepsummary the paper proposes increasing-margin adversarial training ima to improve adversarial robustness of a classifier ima works by alternating between two algorithms algorithm 1 updates the model parameters while algorithm 2 updates the margin estimate by iteratively increasing the margins from clean training samples ima seeks to make the classifier more robust to lp adversarial perturbations the authors conducted experiments on the moons fashionmnist svhn and a ct image dataset to evaluate imas performance against other baselines and found ima to outperform or be on par with them pro improving robustness through the margins from clean samples is an interesting approach cons evaluation on nonstandard image datasets used to evaluate adversarial robustness lack of evaluation on datasets such as mnist cifar10100 or imagenet imas assumption that clean samples from different classes are equally spaced from the boundary might not be valid for images some classes might require more pixel perturbations to change their groundtruth class than others recommendation while the idea of improving models robustness via increasing margins from clean samples is a refreshing direction to counter adversarial examples the basis behind the idea of ima might be flawed ima assumes that clean samples from different classes are equally spaced from decision boundaries when in an equilibrium state however some classes might require more pixel perturbations to change their groundtruth class than others more discussions and theoretical studies would make ima more convincing another major concern i have is the lack of evaluation on standard image datasets such as mnist cifar10100 or imagenet in the paper given its current state i believe the paper is not yet fit for publication comments and questions the results in fig 6 show that ima outperforms other methods but drops sharply at the 03 noise level to almost match trades and advs performance what is its performance vs other methods at levels past 03 the statement a model robust to noises less than the level of 02 is good enough for this application is not substantiated by any previous work or experiments how is imas performance against blackbox attacks
docsepin general the paper has a good quality the idea is based on a common intuition that adversarial attacks are most influential to the points close to the decision boundary the proposed algorithm ima makes effective use of this intuition and adopts an alternating training process as an experimental work the experimental performance of ima is on par with the state of the art in the experimental settings considered in the paper this work is important to the ml community it would be interesting to see further exploration of the algorithm in different testing settings the paper is written clearly there is no difficulty in understanding the content experimental details are provided detailed comments 1 in vanilla adversarial training the choice of the max perturbation epsilonmax is usually crucial to the performance of the classifier on noisy and standard data is the performance of ima also that sensitive to the choice of epsilonmax and it is briefly mentioned in section 33 that ima might indicate a good epsilon for vanilla adversarial training but this does not say anything about the choice of epsilonmax for ima and this could be very important to its performance on clean and noisy data 2 what might happen to the performance of the method under different choices of beta it might be interesting to see how ima deals with the wellknown tradeoff between robust and standard accuracy which is currently one of the main concerns of adversarial training methods other cons 1 figures are not readable when printed given the above concerns my initial rating is 6 this may change given further detail of the paper
docsepthe paper proposes to increase the adversarial robustness of a neural net by training the model on both clean and adversarial samples an adaptive form of projected gradient descent generates the adversarial samples therefore the noise magnitude is estimated separately for each training sample such that the decision boundary assuming a classification problem of the neural net has maximum distance to each training sample strengths 1 appealing idea of having adaptive noise magnitudes 2 relevant experimental section covid19 3 illustrative figures describing the model weaknesses suggestions questions 1 a theoretical discussion about the following points will improve the contribution of the paper a why do large margins result in higher adversarial robustness what happens if i change the attack type b benefits compared to other adversarial training methods are not clear c a more detailed discussion about the equilibrium state is necessary as currently provided in sec 23 this is rather an example 2 experimental section a need to report the average over multiple runs results are very close together and it is hard to favor one method b sec 31 since this is the toy dataset a discussion why the decision boundaries look as they do would be interesting c sec 33 what information is in fig 9 middle and right 3 formatting and writing a detailed proofreading required eg on p 3 using crossentropy loss and clean data for training b some variables are used but not introduced eg xn1 xn2 in sec 23 c figures are too small and not properly labeled in the experimental section d references to prior work are missing eg virtual adversarial training a regularization method for supervised and semisupervised learning e algorithms need rework eg the information of alg 1 can be written in 23 lines though the idea of adaptive adversarial noise magnitude is in general appealing the paper has some weaknesses i the theoretical contribution is relatively minor ii the paper does not present the material sufficiently clearly to the reader and iii the experimental evaluation is not sufficiently conclusive in favor of the papers central hypothesis
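The reviews above describe an alternating scheme that grows a per-sample perturbation budget and generates adversarial examples with PGD. The following is a minimal sketch of that general idea, not the authors' actual IMA implementation; the margin-update rule, the loader yielding sample indices, and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps, alpha=0.01, steps=20):
    # L-inf PGD with a per-sample budget eps (tensor of shape [batch]).
    eps = eps.view(-1, 1, 1, 1)
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0.0, 1.0)
    return x_adv.detach()

def train_epoch(model, loader, optimizer, margins, eps_max=0.3, delta=0.01):
    # margins: per-sample epsilon estimates kept across epochs (illustrative rule).
    # loader is assumed to yield (sample_indices, images, labels).
    model.train()
    for idx, x, y in loader:
        eps = margins[idx].clamp(max=eps_max)
        x_adv = pgd_attack(model, x, y, eps)
        loss = 0.5 * (F.cross_entropy(model(x), y) + F.cross_entropy(model(x_adv), y))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        # Grow the budget only for samples still classified correctly at their
        # current epsilon, so the estimated margin increases over training.
        with torch.no_grad():
            still_correct = model(x_adv).argmax(dim=1) == y
            margins[idx[still_correct]] += delta
```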
### Summary: | the paper proposes a marginbased adversarial training procedure the paper is lacking in terms of a proper discussion of related literature eg similarity and differences to mma the theoretical discussion on page 5 is incomplete as there is no way one can estimate the perturbed samples to do the analysis the authors seem to implicitly already assume that the adversarial samples lie on the decision boundary and the underlying assumptions are not clearly stated the reported robust accuracies see httpsgithubcomfra31autoattack for a leaderboard of adversarial defenses on mnist and cifar10 are worse than those of mma which are in turn worse than sota thus this paper is below the bar for iclr
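The summary above points to the AutoAttack leaderboard when judging the reported robust accuracies. A minimal sketch of how robust accuracy is commonly measured with that package is given below; the call signature is written from memory and may differ between versions, and the model, test loader, and epsilon are placeholders.

```python
# Sketch of a standard robust-accuracy evaluation with AutoAttack
# (https://github.com/fra31/auto-attack), as referenced in the summary above.
import torch
from autoattack import AutoAttack

model.eval()
adversary = AutoAttack(model, norm="Linf", eps=8 / 255, version="standard")

x_test, y_test = next(iter(test_loader))  # assumes a prepared test loader
x_adv = adversary.run_standard_evaluation(x_test, y_test, bs=128)

with torch.no_grad():
    robust_acc = (model(x_adv).argmax(1) == y_test).float().mean().item()
print(f"robust accuracy under AutoAttack: {robust_acc:.3f}")
```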
[… token-ID array cell (input_ids style column), abridged for readability …] |
[… array cell of repeated 1s (attention_mask / labels style column), abridged for readability …] |
[… token-ID array cell (input_ids style column), abridged for readability …] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a multihop communication method for multiagent reinforcement learning this method is based on the loosely coupled reward structures among agents which as far as i am concerned generally hold in complex multiagent settings the authors use experiments on cityflow mpe and magent to demonstrate that their method can outperform the sota methods and is scalable the empirical results are impressive however there are some concerns regarding the method that lead to my overall negative rating firstly and most importantly although the authors emphasize that they are communicating the intentions of agents i think their method is quite similar to those communicating local observations like ndq httpsarxivorgabs191005366 dgn or collaq httpsarxivorgabs201008531 one way to interpret the proposed communicating network structure is as a normal multihop communication mechanism but only with a softmax activation function compared to previous works studying communication of local observations the proposed work 1 needs to address the problems induced by the joint policy like sampling from it the authors use a variational inference approach to conduct sampling however this approach may hurt the scalability and it 2 requires agents to have access to the global states for partially observable environments the proposed method needs to rely on dgn some other points 1 i was expecting ablation studies where dgn is ablated on partially observable environments 2 some parts in the method section are hard to follow
docsepthe paper proposes a scalable approach via intention propagation to learn a multiagent rl algorithm using communication in a structured environment an agent encodes its policy and sends the intention to the neighboring agents with the assumption that only the closest agents would be affected by it the approach involves using techniques from the embedded probabilistic inference literature using meanfield variational inference the joint policy is estimated using the meanfield approximation that is obtained via propagating intents in an iterative manner so this approach helps in avoiding the need to factorize the value function explicitly the related works section does a nice survey of related approaches and the paper shows conceptual differences to an earlier proposed mfg that has stricter requirements the experiments shown cover many important baselines that are shown to be good baselines in their respective environments ip outperforms all the baselines in three competitive benchmarks i have a few questions about the clarity of the presentation how important is the graph structure defined by kmeans a comparison with a randomized graph and an ablation with different reset time n intervals would be interesting in the experiments it would be interesting to check if intention only helps the nearby agents how does adding or removing agents to the set of neighbors affect learning a comparison with a fully connected graph should be sufficient the plot in the appendix shows results on the cityflow task which has a very structured observation with the set of immediate neighbors always a set of 4 doing such an analysis on a more dynamic environment like mpe would be helpful what is the computational cost of a densely connected graph as compared to a method without using a fixed topology fig 4c does not show plots until convergence overall i feel some restructuring of the paper would benefit the reader explaining some missing portions of the algorithm and for eg taking out the environment images from the main text
docseppaper summary the paper considers the cooperative multiagent marl setting where each agents reward depends on the state and the actions of itself and its neighbors the paper has a theoretical claim that for such a reward structure the optimal maximum entropy joint policy is of a form that can be factored into potential functions one for each agent in particular if the sum of all agents rewards is a function of pairwise actions those potential functions are one for each agent and one for each pair of actions ie the equation after proposition 1 then the paper proposes to use a meanfield approximation to approximate the optimal joint policy equation 3 which leads to a concrete algorithm that relies on passing the embedding of each agents local policy around to neighbors the paper then empirically shows that the algorithm is particularly effective for domains with a large number of agents major commentsquestions 1 although the motivation has an interpretation of intention propagation the resulting architecture figure 1b and loss functions section 42 seem to be a standard message passing architecture with sac loss functions that loses the intention semantics i do not see too much algorithmic novelty here 2 for the baselines used in the experiments it seems that only ip and dgn allow communicationmessage passing during execution which makes it unsurprising that the two methods outperform other baselines minor commentsquestions 1 the beginning of section 3 says the paper considers maximum entropy as the optimization objective while etapi at the beginning of section 4 says the objective is longterm reward no entropy this seems to be an inconsistency 2 for the assumptions on rewards proposition 1 assumes that each agents reward depends on its neighbors while the derivation of equation 3 and thus the following algorithm further assumes that the reward depends on pairwise actions it is a little bit unclear what assumptions are required for all the theoretical and experimental claims of this paper 3 is there reason to believe that the multiround message passing will converge to the fixedpoint of equation 2 4 what is the overgeneralization issue overall weak accept the paper has a clear introduction and motivation of the proposed algorithm the insight that the optimal maximum entropy joint policy takes the form of a markov random field might be of some value and interest however i dont think the resulting method has much algorithmic novelty thanks for the response and ive increased my score i am satisfied with the response but still not convinced about the algorithmic novelty on the intention semantics built into the method even after reading b1 in particular it seems that the loss functions do not drive the mus represented by nns to the fixed point solution of eq 3 psi shows up in eq 3 but does not play a role in the following development of the method
docsepthis paper proposes a method for generating policies in cooperative games using a neighbourhoodbased factorisation of reward and an iterative algorithm which independently updates policies based on neighbour policies and then propagates the policy to neighbours using function space embedding the experimental results looked promising so there seems to be an idea here worth communicating the paper was very hard for me to follow im not an expert in the area and wouldnt expect to follow all of the reasoning in constructing the method but i would expect to be able to follow some clear statements of the algorithm or its theoretical properties guarantees of some solution quality given certain assumptions the parameters affecting this etc instead the main body of the paper felt like a collection of pieces that were used when developing the algorithm i would suggest it might be easier to follow if written from the top down instead present a highlevel overview of the idea give a detailed description of the algorithm and the experiments and leave the derivation to the appendix despite being in the appendix the algorithm is less than half a page and doesnt explain the variables eta and kappa might be described elsewhere but it would be helpful to reference where j is a loss which one one of the claimed contributions is that this is a principled method however the exact assumptions are not clear and the chain of issues discussed throughout section 4 seems to include discussion of approximation what makes this principled this would seem to need a clear statement what are the exact assumptions and what precisely is the quality of the output is it exact what are the complete set of parameters where does approximation fit in another claimed contribution is computational efficiency how does the computational cost compare to the baselines in the experiments proposition 1 the optimal policy has the form 1/z exp(...) i found the use of optimal slightly hard to follow throughout this the usual definition of optimal policy would be a value maximising policy which would be an argmax rather than a softmax following that definition this proposition wouldnt be true so it seems like it needs more explanation or more careful wording the cited prl article levine 2018 seems to retain this standard use of optimal it uses a distribution over trajectories with an equation similar to here a softmax over accumulated trajectory rewards and makes use of the property that trajectories corresponding to an optimal policy have maximum probability in that distribution can the authors clarify this use of optimal proposition 1 for clarity explain the intention of psi is this the future accumulated reward given the current state and selected action comments after author discussion the authors were quite active in editing the submission and addressing the concerns i had i still find the paper a bit hard to follow but none of my original concerns remain
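The reviews above describe an algorithm that encodes each agent's local policy into an embedding and iteratively exchanges it with neighbours under a mean-field approximation. A minimal sketch of that kind of neighbour-averaging message passing is shown below; the network shapes, the fixed adjacency matrix, and the update rule are illustrative assumptions rather than the paper's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class IntentionPropagation(nn.Module):
    """Illustrative K-round message passing over agent 'intentions' (policy embeddings)."""
    def __init__(self, obs_dim, n_actions, embed_dim=64, rounds=3):
        super().__init__()
        self.encode = nn.Linear(obs_dim, embed_dim)          # local observation -> initial intention
        self.message = nn.Linear(2 * embed_dim, embed_dim)   # combine own + neighbour intentions
        self.policy_head = nn.Linear(embed_dim, n_actions)
        self.rounds = rounds

    def forward(self, obs, adj):
        # obs: [n_agents, obs_dim]; adj: [n_agents, n_agents] 0/1 neighbour graph
        mu = torch.tanh(self.encode(obs))
        deg = adj.sum(1, keepdim=True).clamp(min=1)
        for _ in range(self.rounds):
            neigh = (adj @ mu) / deg                          # mean of neighbours' intentions
            mu = torch.tanh(self.message(torch.cat([mu, neigh], dim=-1)))
        return F.softmax(self.policy_head(mu), dim=-1)       # per-agent local policy
```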
### Summary: | the paper describes a framework for multiagent reinforcement learning that uses markov random fields unfortunately the paper is not clearly written and would benefit from significant revisions that improve its structure and make the model and approximations more explicit in particular the paper uses a graph that says which agents i and j communicate this is typically called the coordination graph in this setting see collaborative multiagent reinforcement learning by payoff propagation kok and vlassis 2006 note that within that paper they provide a qfunction decomposition which can only serve to approximate the optimal policy the authors of this submission claim that an mrf is sufficient for optimal policies i fail to see how this is true in particular proposition 1 has to be checked more carefully i tried to go through it but it did not seem to make sense to me why is there an exp term in the definition of the optimal trajectory probability why would minimising the kl divergence be enough to obtain an optimal policy perhaps it gives an optimal policy within the class of mrf policies but thats not the same thing as the globally optimal policy overall i find the lack of clarity and indepth discussion of early related work disturbing particularly with respect to the theoretical claims in the paper
[… token-ID array cell (input_ids style column), abridged for readability …] |
1, 1, 1, … (attention_mask: a run of 1s, one per token; every position is attended, none padded) …, 1 ] | [
30003, 310, 1677, … (labels: roughly 2,000 target token IDs for this row, same length as the attention_mask) …, 253, 2929, 209 ] |
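For readers puzzling over how the text columns relate to the numeric columns: the integer arrays in input_ids, attention_mask, and labels are just a tokenized copy of the Input/Output text. The sketch below shows one plausible way such a row could be produced; the tokenizer name, the maximum length, and the choice to copy input_ids into labels are assumptions for illustration, not a description of the pipeline that actually built this dataset.

```python
from transformers import AutoTokenizer

# Assumption: gpt2 stands in for whatever tokenizer actually produced these IDs;
# the real tokenizer, max length, and labelling rule are not recoverable from the dump.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def build_row(input_text: str, output_text: str, max_length: int = 2048) -> dict:
    """Tokenize one (Input, Output) pair into the five columns of this dataset."""
    # The Input field already ends with the "### Summary:" cue, so the target
    # text is simply appended to form the full training sequence.
    enc = tokenizer(input_text + " " + output_text, truncation=True, max_length=max_length)
    return {
        "Input": input_text,
        "Output": output_text,
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all 1s when nothing is padded
        "labels": list(enc["input_ids"]),         # labels appear to mirror input_ids here;
                                                  # a common variant masks prompt tokens with -100
    }
```

Applied to the row that follows, a call such as build_row(prompt_plus_reviews, meta_review_summary) would yield arrays of the same shape as the ones shown, though the exact IDs would only match if the same (unknown) tokenizer were used.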
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors propose a new method for calculating distributions over states in markov problems in which the true state is not directly observable the method relies on using an existing model and then fine tuning the results online at each time step experiments presented show this gives a more accurate distribution and that this more accurate distribution leads to statistically significant improvements in performance the abstract introduction and background is very clearly written it does a good job of introducing the problem to be solved and outlining the method to be used at a high level the method itself is rather mathematically simple it is based on simple mathematical techniques in the area and effectively amounts to sampling from known distributions while the authors state that they present an algorithm it might be better to view this as a framework as the method involves an externally supplied model as an input and the fine tuning method needs to be tailored to this model the experiments performed use only a single model which limits the conclusions that can be drawn from the results the experimental results are only provided for a single domain albeit for a hmm pomdp and fosg setting in the same model the choice of domain seems rather poorly selected as the basic models used as a comparison in the domain already appear to be almost optimal meaning there is little improvement from the new technique while the results may be statistically significant they appear to be a very minor improvements in effect size it would be very useful to see some mention of time in the results section the procedure doesnt appear particularly onerous but in a game playing setting there may be time limits and any improvement in quality would need to be seen after x seconds not after x iterations these criticisms aside the experiments appear to be fairly conducted and reported and look at sensible aspects of the problem with a good choice of baselinescompetitors the authors present an interesting idea well however the experimentation is very limited even for a conference paper and the results given show a very small effect size more data is needed to convince that the technique is useful docsepthis paper seeks to understand whether finetuning a learned belief state model can enable improvements in performance in partially observable problems in particular in the hanabi domain the general intuition is that while the original parametric model is trained on a wide distribution of belief distributions input histories during deployment the model need only be good on the sequence that is actually realized the authors demonstrate that indeed such techniques lead to improved performance over a nonadaptive belief state method in particular when there is significant stochasticity in the transitions the problem of learning and optimizing over belief states is an important problem in reinforcement learning and while the ideas in this paper are not particularly novel the core idea is implemented and tested well i thought that the lack of details made certain section rocky but in general found the paper a pleasure to read strengths simplicity is perhaps the main strength of this paper the method is very closely inspired by a wide body of existing work on belief filtering and the method looks like finetuning a learned model on augmented data empirical details withstanding the generic idea of reallocating the capacity of a neural network at testtime in this way seems welljustified and simple to implement the paper is easy to 
read and exposes the main concepts well weaknesses the paper is a little light on exact algorithmic details and i did not see more in the appendix in particular i think it would be useful to exactly detail out the algorithm being proposed in a more fleshedout manner somewhere in the main text or appendix it would also be a good idea within section 2 to more explicitly what the goal of the belief state modelling problem is and what the belief state model will be used downstream for as this dictates what approximations are useful worthwhile i found the latter point difficult to understand in the current treatment along the same lines i think it would be useful to more explicitly outline in the appendix exactly how bft is used for downstream decisiontime planning eg with rl search to generate an action at a timestep is bft for 10k timesteps run only once or is it necessary to run multiple times within the policy search loop i found no details for exactly how the particular choice of 10000 gradient steps for bft were chosen how sensitive is bft to the number of gradient steps that are taken to finetune the learned model how much does this depend on the quality of the original model eg with a worse original model does running bft degrade the quality of the belief since bft bootstraps off of the original model it seems that this problem assumes access to the true underlying state space and the transition dynamics emission distributions in this state space in which case the main purpose of the neural network is simply amortization of an exact search problem it would be useful to make these assumptions more explicit earlier perhaps even in the introduction as a question to the authors how would these techniques transfer to the setting where these assumptions do not hold in particular where the underlying state space mathcalx is not known i think the point that the method works because it refocuses the capacity of the parametric belief state model is interesting but i dont think it is supported very strongly by the provided data it would be interesting to see whether these effects diminish as the neural network capacity increases or equivalently if the effects amplify as it decreases since presumably the better the original neural network is at the belief modelling problem the less impact a finetuning step should have along these lines i think it would be interesting to have some discussion as to whether finetuning procedures can simply be supplanted by training bigger networks copied from above the problem of learning and optimizing over belief states is an important problem in reinforcement learning and while the ideas in this paper are not particularly novel the core idea is implemented and tested well i thought that the lack of details made certain section rocky but in general found the paper a pleasure to read docsepthis paper proposes a strategy for improving a trained parametric model for beliefstate update by generating and training on new data online this strategy is particularly helpful for dealing with problems of covariate shift when eg the system has moved outside the training regime and dependence on unobservable factors eg the agents policies in a game setting this finetuning strategy is shown both to improve the fidelity of beliefstate updates and to have an ultimate impact in the quality of play in hanabi a game with a substantial amount of private information this is an interesting paper that is fairly novel as far as i know however there were some problems in the exposition that 
left me feeling that i might have misunderstood some aspects of it i do think it will be important to clarify some of these points in any published version of the paper many of my confusions were about the types or apis of various components i will start with larger questions and then list some minor ones in section 31 i immediately was confused by the type of bt1yt1 because y is capitalized it seems to be a random variable but that doesnt make a lot of sense to me here are some possible interpretations of bt1yt1 given a particular observation yt1 then its just the next belief state but why is y capitalized not given an observation its a random variable over next beliefs b where the distribution over b is introduced by distribution over observations not given an observation its the next belief state we would get if we made no observation not given an observation its some sort of expectation over the next belief this is tricky i think the discussion of the sampling procedure in that same section says it produces samples of xt1but it seems to be important that we are getting samples of the yti as well so we can use them to finetune our estimator im not sure the analogy with approximate dynamic programming is helpful or at least it would help to clarify it in particular is it critical to actually finetune theta an alternative would be to stick with the particle view and use the sampled xs to represent the posterior belief or use some combination of the belief produced by the parameteric model theta and the particles more explicitly rather than actually changing the theta to me it would feel much more like adp if you were training some function btheta to represent the beliefstate directly on each iteration which would be yet another possible approach i suppose except that representing belief states is notoriously difficult i really wanted figure 2 which is called table 2 clarify things for me but it really didnt help me at all i think something more like a dataflow diagram that illustrates how theta gets changed over time and makes very clear that theta doesnt parameterize b nothing parameterizes or really explicitly represents b itself but instead parametrizes the stateestimator box would be more helpful the standard error values seem quite small given the amount of potential variance in the different training processes please state very clearly exactly what sources of variance are being captured here initializations batches data for training the initial theta randomness due to evolution of the game trajectory at runtime due to card shuffling randomness in agents play etc variance of sampling to generate the finetuning data variance of the finetuning training potentially due to batching i guess what were the various n values the performance improvements seem to be real and interesting but the gains arent enormous this is fine but im not sure id claim that performance is greatly improved more minor i appreciate the generality of the approach and the desire to be agnostic to the form and training details of ftheta but i had to go digging a bit into other papers to try to understand the actual approach you used a bit more detail about that would have been helpful in understanding your approach in early parts of the paper you talked about going from step t to step t1 later it was from step t1 to step t which is naively from independence assumptions covariate distribution should not impact model performance missing shift this paper has interesting ideas that seem to be useful clarity could be 
improved but interesting enough to publish i wanted to give it a 7 docsepthis paper concerns the idea of directly learning a model of a belief state mdp from samples from the ground truth dynamics and observation models one could then use that model for decisiontime planning eg some form of tree search in particular the paper aims to improve upon this idea by finetuning the belief state model before search using samples from local state region experiments are performed in hanabi showing that finetuning does improve belief accuracy and enables search through belief space to improve control performance in problems where exact inference is impractical after author response i am leaving my original review below for transparency but my most significant concerns have been addressed in particular the rlsearch paper has been accepted for presentation at neurips 2021 the authors have provided the sample sizes for their averages and they are sufficient to support their claims i still think the authors should aim to make the paper more selfcontained by providing more algorithmic details about rlsearch which is a new approach how it interacts with bft and why it was selected for these experiments instead of a more wellestablished search method original review belief state inference is an important step toward developing agents that can make sound decisions in partially observable environments a great deal of effort has been spent studying exact inference or approximate inference with access to ground truth state distributions this papers focus on bringing this tool to more complex domains where representing distributions and performing exact inference are both a challenge is worthwhile and is bound to be of interest to the iclr community the results also fit into a larger story about neural network finetuningmetalearning which is certainly of interest technically the approach is only modestly novel mainly applying an idea that has been studied elsewhere in a context where it hasnt been applied yet that said in my opinion the empirical results are both novel and significant in particular the idea of directly learning a belief state model has not gained a lot of traction because it frankly doesnt work all that well we can see in the 7card hanabi results that performing search on such a model yields essentially no benefit we are lucky that it didnt cause harm the fact that the finetuning enabled the approximate belief model to yield a planning benefit is notable and a promising signpost toward future work which might enable belief state inference to scale to larger more interesting partially observable problems all that said i do have a significant concern about this paper the empirical results rely heavily on an existing algorithm called rlsearch as far as i can tell the paper that describes rlsearch has not been peer reviewed and has only been available on arxiv for about a month at time of reviewing the paper makes a very brief attempt to explain rlsearch at a high level and how that algorithm is altered for this paper but this is not sufficient from this paper i do not know what algorithm is being performed in these experiments or why rlsearch is even a sound foundation to build upon let alone why it is the right base algorithm to combine with bft in order to answer these empirical questions can bft be applied to other search methods is there some special synergy with rlsearch since they both appear to be finetuning methods if the answers to these questions is no then i would recommend using a 
baseline algorithm that is more established and better studied if the answer is yes then that needs to be made far more explicit and clear if the answer is we dont know then i think thats a major missing piece of this work the fact is that the paper is not sufficiently selfcontained and the parts that it doesnt contain havent been peer reviewed that makes me deeply uncomfortable and makes it difficult for me to confidently assess the technical quality and the significance of the findings another concern i have is that i dont know how many independent trials are represented in the empirical results maybe i just missed it but it seems like a very important detail to state if the number is not sufficiently high that would raise questions about the strength of the support for the conclusions a couple of more minor issues the references section is frankly a mess there is no consistency in the formatting and content of the references and some dont even list a venue just authors a title and a year p 2 at the time of writing there has been no successful demonstration of this it wasnt clear to me at this point what this refers to even now im not totally sure it would be good to have a more clear expression of the open problem here after author response the paper is wellwritten and considers an important and challenging problem the empirical results seem to offer a path to progress in a direction that hasnt shown much promise the paper could be improved by offering more details about rlsearch a recently introduced approach upon which the experiments heavily rely original summary the paper is wellwritten and considers an important and challenging problem the empirical results seem to offer a path to progress in a direction that hasnt shown much promise however the algorithmic results make heavy use of an algorithm that is not sufficiently described in this paper and that has not been peerreviewed though i would really like to have these results in the literature that makes me uncomfortable with recommending acceptance
### Summary: | the paper proposes to fine tune the belief states of a mdp for later using the learned model for decisiontime planning eg via search the contribution is wellpresented motivated and focused to a specific scenario which is generally considered challenging in the literature this scenario is exemplified by the cooperative card game hanabi which takes the role of the benchmarking setting for the empirical evaluation of the finetuning procedure the major concern raised in the review and discussion phases are about the limited evaluation which is centered around only hanabi as well as the magnitudes of the improvements over previous baselines however three knowledgeable reviewers agreed that since the setting has been historically challenging the reported improvements are in fact significant and potentially inspiring future works in this direction the paper is accepted provided that the authors include and polish in the cameraready the additional experiments over the parameter sensitivities the ablation tests and the discussions highlighted by the reviewers in the comments | [
2593, 33848, 533, … (input_ids: roughly 2,050 token IDs encoding this row's Input and Output text) …, 253, 5701 ] | [
1, 1, 1, … (attention_mask: a run of 1s, one per token; nothing is padded) …, 1 ] | [
2593, 33848, 533, …, 5701 (labels: 2,048 token ids)
] |
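Each record in this dump pairs the Input/Output text with three tokenized columns: input_ids, attention_mask, and labels. A quick way to sanity-check a record is to load it and decode the ids back into text. The sketch below is only an illustration of that idea; the data file name and the tokenizer checkpoint are placeholders, since the dump does not say which tokenizer produced the ids.

```python
# Illustrative only: the file name and tokenizer are assumptions, not taken from this dump.
from datasets import load_dataset
from transformers import AutoTokenizer

ds = load_dataset("json", data_files="review_summaries.jsonl", split="train")  # hypothetical source file
tok = AutoTokenizer.from_pretrained("gpt2")  # hypothetical tokenizer; use whichever one built the ids

row = ds[0]
print(row["Input"][:300])    # the prompt, "### Review:", and the review text
print(row["Output"][:300])   # the reference summary

# Basic consistency checks on the tokenized columns:
assert len(row["input_ids"]) == len(row["attention_mask"]) == len(row["labels"])
assert all(m == 1 for m in row["attention_mask"])  # the rows shown here carry no padding
print(tok.decode(row["input_ids"][:50]))           # should read like the start of the Input text
```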
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
overview in this paper the authors augment the instancelevel selfsupervised learning with clusteraware learning mechanism during the training procedure specifically for each training batch the authors project the instances into a clustering space and then utilize a clusteraware contrastive loss to push the augmented samples from the same instance to belong to the same cluster otherwise for different instances to ensure the clustering not to collapse into a single or a few cluster to find the trivial solutions the authors further add a penalization item keep the entropy of clustering assignment be uniform to some extent the experimental results demonstrate that the proposed method can improve the representation learning performance over sota methods on several datasets while also outperforms the previous methods on clustering task further ablation studies show that the loss is effective to ensure the learned representation more discriminative and clusterable strength 1 the intuition behind the proposed method is intuitive and straightforward it has been shown in previous work like a that combining them together can boost the selfsupervised learning performance significantly this paper further demonstrates the promise of this direction a unsupervised learning of visual features by contrasting cluster assignments caron et al 2 the authors performed experiments to show that the proposed method achieves better performance on both representation learning task and clustering task on various image datasets such as cifar10 cifar100 and stl10 3 the ablation studies showed that the proposed clustering loss indeed helps to learn a better representation compared with the baseline model with a substantial margin which demonstrates the effectiveness of the proposed method weakness 1 the paper has a poor literature review of previous works in the related work both instancelevel representation learning and deep clustering methods are not fully covered and compared more importantly the authors missed a very relevant and recent paper as pointed above a the idea behind this paper is very similar to the above one 2 in this paper the authors merely presented the results on relatively small datasets though it is a bit harsh to always request experiments on the largescale dataset such as imagenet proving the efficiency seems necessary especially when it is known that keep training on a largescale dataset for a long time may dismiss the gap 3 in table 3 it seems that only with multiscale clustering loss the performance will be improved across all metrics this indicates that the proposed algorithm is a bit sensitive to the hyperparameter settings even with eq1 eq5 the performance drops in some scenarios which seems counterintuitive all of these results demonstrate that the proposed method is still a bit mysterious and vulnerable 4 the notations in the paper is hard to interpret and a bit abuse the formula of eq4 is also a bit confusing first what does k stands for second why the denominator excludes the case of ik if it is a regular contrastive loss summary overall i think this paper is a good trial of combining instancelevel contrastive loss and deep clustering philosophy into a single learning regime which i think is a promising direction to explore however as i pointed above the novelty of the paper should be better explained also according to the ablation study the performance seems vulnerable to the choice of hyperparameters such as cluster numbers this increases the uncertainty about the effectiveness of the 
proposed method furthermore the proposed method is not demonstrated on largescale dataset such as imagenet which is supposed to be a routine setting on selfsupervised learning community i would recommend the authors could answer my above questions raised abovedocseppros 1 the paper presents a good solution for an important problem in selfsupervised learning and contrastive learning proposed methods in the literature do not take the cluster structure of items into consideration this paper proposes a hybrid loss function that aims to preserve the cluster structure equation 7 2 a wide range of experiments are conducted to evaluate the proposed method c2bin shows a significantly better performance using the knn classifier particularly on cifar100 the good performance is also evident in clustering experiments thus as the method promises the cluster structure is better preserved cons 1 the motivation part of the paper is not precise the last sentence of the second paragraph of section 1 states however since aforementioned instance discrimination does not consider the semantic similarities of the representations eg same class it results the learned representations to be uniformly distributed wang isola 2020 it is true that semantic similarities are not considered in selfsupervised and contrastive learning settings however this is a part of the problem as class labels do not exist moreover it is not clear to me why it would lead to a uniform distribution of representations i agree that cluster structure might be lost 2 the notation used in section 3 is confusing i mention some possible misuses of notation or typos section 31 n is defined as the number of unlabeled images later in section 33 second paragraph n is used as the minibatch size section 32 equation 1 pa za is not properly defined it refers to simclr and as i checked in the paper the same notation is not defined moreover the paper should be written selfcontained meaning that main formulation should be mentioned in the main body of the paper figure 3 two siamese networks etheta and ephi are not depicted in the figure equation 4 denominator i neq k should it be i neq j after equation 4 it is stated that the vectors c and c are obtained from xi and x i respectively using the encoder ftheta this is not precise the output of f would be r and it goes through pck to obtain z then c is computed using equation 3 comments and questions table 3 shows that the choice of set k is quite important if we are not provided with proper k and we have no access to labels can you recommend any strategy in this case do you think using a fixed k with various elements is good for any dataset it would be nice to sort the methods of table 7 chronologically it is not clear how figure 9 10 and 11 are produced are the grouped images random samples and nearest neighbors docsepsummary this paper applies batchwise cluster assignment with bootstrapping to learn unsupervised representation this paper claims resulting representations are better suited to nondiscriminative tasks where clustering is important strengths this paper motivates a good direction over current unsupervised representation learning considering clustering performance in addition to discriminative power is a fair research question in my opinion the proposed idea is interesting and seems reasonable evaluating representation learning on nondiscriminative tasks is a good idea concerns a critical concern is the experiment setup particularly the choice of resnet18 as backbone and only evluating on cifarstl 
these datasets are quite small and are not used as the primary performance benchmark for modern unsupervised image representation learning work this paper claims improvement in certain aspects over simclr moco v2 and byol which all experiment on much larger capacity models and much larger datasets many insights in these prior work tie heavily to scaling to larger dataset and model capacity this makes it difficult to compare this work recommendation the experiment setup in this paper deviates significantly from recent work of similar nature therefore i am not convinced by the findings presented and i recommend to reject this paper
### Summary: | the idea of combining instancelevel contrastive loss and deep clustering is a promising direction in recent unsupervisedselfsupervised visual representation learning studies however authors did a poor literature review and did not cite and compare with quite a few recent popular work exploring the similar direction the proposed methodology is not particular novel and the experimental results are also not convincing overall the paper explored a promising research direction but the paper quality is clearly below acceptance bar | [
30003, 310, 1677, …, 209 (input_ids: 1,514 token ids)
] | [
1, 1, 1, …, 1 (attention_mask: 1,514 entries, all 1)
] | [
30003, 310, 1677, …, 209 (labels: 1,514 token ids; first and last values match input_ids above)
] |
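Nothing in the dump states how these token columns were produced, but labels begins and ends with the same values as input_ids and has the same length, which is what a plain causal-LM preprocessing step would give: tokenize the prompt and summary together, then copy the ids into labels. The snippet below sketches one such recipe purely as a guess at the pipeline; the tokenizer name and the 2048-token cap are assumptions, not details taken from this dump.

```python
# Hypothetical preprocessing sketch; the dump does not document its own pipeline.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")  # placeholder; any causal-LM tokenizer works the same way

def build_features(example, max_len=2048):
    """Turn one (Input, Output) record into input_ids / attention_mask / labels."""
    text = example["Input"] + " " + example["Output"]
    enc = tok(text, truncation=True, max_length=max_len)
    enc["labels"] = list(enc["input_ids"])  # standard causal-LM objective: labels mirror the inputs
    return enc

# Usage with a Hugging Face dataset of such records: dataset.map(build_features)
```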
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presented a method for learning an animatable 3daware generative model for face images the method is built on top of the 3daware generative model gram and introduces 3dmmguided imitation learning to disentangle the latent space corresponding to identity and expression variations the method is trained on an image dataset and compared with a few previous baselines discofacegan conifg etc the method is quantitatively evaluated with fid and kid where the performance is comparable to but generally not better than previous approaches the method also performs qualitative comparisons on mostly randomly sampled images where the authors showed better view consistency than previous work the paper has the following strengths the topic is highly interesting controlling the trendy 3daware generators is of significant interest in the research communities of machine learning computer vision and graphics achieving highquality control of 3daware generators is also desirable for many applications the idea is interesting and has its novelty discofacegan has leveraged imitation learning to learn to sample 2d faces from a 3dmm fashioned latent space this work adopted a similar idea to the 3daware gans and introduced methods to guideregularize the 3d representations results are encouraging thanks to 3daware gans the proposed method resulted in higher quality view consistency which is generally not easy to achieve with a 2d gan architecture im generally satisfied with the technical components and the novelty of this paper my main concerns are regarding the quality of results and the evaluation detailed as follows in achieving facial animation the paper works well around the mouth region but seems to fail at the eye region the videodriven animation cannot animate motions around the eyes such as blinks this indicates no blinking eyes have been learned in either the 3d deformation modeling or the appearance modeling or a combination of them one can blame that the training data has few closedeye images but would data augmentation of closedeye images resolve this problem in the 3d gan scenario could 3dmm guidance be sufficient to model eye animation the discofacegram is a baseline proposed in this method but not well explained how does it different from the proposed approach it looks like the discofacegram generated images have artifacts unseen from either discofacegan or gram where are they coming from is it due to the architectural design or the training process quantitative results seem to be not better than previous approaches almost all comparisons are done with random samples across different approaches which is not easy to interpret the best way might be to project images into the latent space and compare the results of editing so there is a common reference on the other hand the video results showed some projectionthenedit animations where we can see more artifacts how were these imagesvideos produced it looks like the approach might have limitations dealing with reconstructionprotection compared to 2d stylegans perhaps more discussion is needed here too i believe there need to be more discussions around the limitations of the method in terms of technical details and applications as mentioned before the negative societal impact is adequately discussed in my opinion docsepthis paper proposes to generate 3dmm controllable 3daware faces they design a generator that produces two 3d fields namely a template radiance field and an expressiondriven 3d deformation field the expressiondriven 3d deformation 
field is learned with the guidance of 3dmm experiments show that this method can produce highquality 3d consistent controllable faces with delicate control the demo video of this paper is impressive from the perspective of controllable 3daware generative models the design of one template radiance field and a deformation field is interesting and reasonable the paper is wellwritten 1 the whole story in the introduction and title seems to be overclaiming words such as realistic video avatars and highquality animatable video avatars have appeared in the title and abstract from the perspective of realistic video avatars the results can barely meet the need as all samples are aligned at the same position the results are far from realistic moreover if the authors intend to show results in a reenactment setting it is better that they can show comparisons with results generated in a gram face vid2vidpirenderer manner and ask users to provide studies on the realness and quality of the results the reviewers suggestion is to tone down the story and change the title to something like realistic animatable 3daware face image generation with xxx explicitly deformable radiance manifolds 2 the comparisons and discussions are not sufficient a the authors write that perhaps the most relevant work to ours is discofacegan this might be leading readers to believe that only comparing to discofacegan is sufficient actually stylerig is also very relevant to the task of disentangled generative modeling and basically shares the same setting as this paper the authors should discuss stylerig although they also suffer from texture stitching and try making a comparison with them b please also refer to the question part c the numerical comparisons are limited why not use the metrics in discofacegan on disentanglement for evaluation 3 no audio is provided in the supplementary video thus the reviewer cannot tell whether the lip movements are consistent with the source also there are no 3d results shown making the 3d consistent claim not strong enough docsepthis paper proposes a novel pipeline for animatable 3daware face image generation they first propose a template radiance field to obtain the rough face geometry then an expressiondriven deformation field is proposed to animate the facial images based on the change of 3dmm parameters qualitative results verify the effectiveness of the proposed pipeline strengths 1 the proposed method makes sense to me it is natural to predict the nonrigid facial movement by inferring the perpoint deformation based on 3dmm parameters the presented qualitative results are mostly visibly plausible 2 the paper is wellwritten with thorough related works weakness 1 in eq 7 the chamfer distance is computed between the canonical 3dmm shape reconstructed by the input exp and id param and the predicted shape inferred from the depth map as indicated by authors how do the authors guarantee that the two shapes are at the same resolution ie they are comparable for cd calculation besides since the 3dmm only contain meaningful information at the facial regions how do the authors deal with the neighbor regions like hairs 2 i am curious about how do the authors deal with the wearings like the glasses ear rings etc such small items could not be inferred from the 3dmm prior so how to guarantee the visual quality and 3d shape of them 3 missing some relevant comparison baselines some important baselines are not compared including but not limited to 1 2 3 4 1 pie portrait image embedding for semantic control 
tewari et al tog 2020 2 stylerig rigging stylegan for 3d control over portrait images tewari et al cvpr 2020 3 designing an encoder for stylegan image manipulation tov et al siggraph 2021 4 headnerf a realtime nerfbased parametric head model hong et al cvpr 2022 they are all properly discussed in the appendix docsepthis paper proposed a framework for face image manipulation it attempts to achieve finegrained control over attributes and facial expressions by better preserving 3d consistency the key contributions include a template implicit field and a 3d deformation field the reported experimental results show that the proposed method can produce highquality animatable video avatars with good texture consistency strengths the concept of 3daware face generation is a sensible idea ensuring 3d consistency for face image manipulation has shown high texture consistency across poses and expressions weaknesses it applied 3dmm expression parameters as a prior for disentanglement of latent variables although the experiments in this paper only focuses on 2d face image editing from the title to the abstract it easily confuses the reader that they addressed 3d face editing task eg animation usually implies the generation of a 3d avatar from a given face the last sentence of the abstract about strong visual 3d consistency is misleading because there is no 3d reconstruction and editing contribution in this paper indeed regarding the 2d face image generation task the idea of combining image rendering and 3dmm guided face editing is not novel also from the experimental result view they are not impressive to argue their contributions from figure 3 to figure 4 their improvements are marginal compared to discofacegan more importantly the comparisons are not convincing with only two out of dated methods some missing methods will need to be considered for comparison in the experiment transeditor transformerbased dualspace gan for highly controllable facial editing interfacegan interpreting the disentangled face representation learned by gans i am curious about the performance of anifacegan in some extreme situations eg extreme pose or lighting conditions meanwhile all demo video are very short it is difficult to evaluate the quality of an animation for such short duration i am wondering if the authors have video demo of longer period say 3 seconds
### Summary: | the paper addresses an interesting topic and advances the state of the art the authors have responded sufficiently to the criticisms of the reviewers including the one reviewer that was recommending rejection the authors are encouraged to incorporate the clarifications and additional results in the final version | [
30003, 310, 1677, …, 2715 (input_ids: 1,855 token ids)
] | [
... (attention_mask: all-ones sequence omitted) ...
] | [
... (labels: token ID sequence omitted) ...
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper proposes a twostream model including the raw videos and the keypoint sequences for sign language recognition and sign language translation the twostream model also utilize a variety of techniques including bidirectional lateral connection sign pyramid network and framelevel selfdistillation to further improve their model empirically the twostream model outperforms the previous best method strengths 1 this paper takes advantage of the keypoint sequences and a variety of techniques and achieve excellent results 2 this paper is well written weaknesses 1 it may be confused that why the keypoint sequences can achieve such an effect improvement 2 this paper introduce the robustness of previous models which suffer from dramatic performance degradation when the background or signer is mismatched between training and testing but is the twostream model can solve this problem the authors have addressed the limitations of their work docsepin the slr task the paper introduces two separate streams to model both the raw videos and the keypoint sequences for the two streams to interact with each other better the paper proposes a variety of techniques including bidirectional lateral connection sign pyramid network and framelevel selfdistillation in the slt task the paper extends twostreamslr to twostreamslt by attaching an mlp and nmt experimental results show that twostreamslr and twostreamslt achieve sota performance on slr and slt tasks in three datasets strengths 1 the paper proposes twostream network including bidirectional lateral connection sign pyramid network and framelevel selfdistillation methods to interact rgb videos and keypoint sequences for advancing slr and slt 2 the paper reported an improvement on slr and slt tasks weaknesses 1 these lack the evaluation of visual redundancy and interact with both the raw videos and the keypoint sequences 2 it is unclear about the usage of domain knowledge and key sequences na docsepthis paper proposes a twostream network for sign language recognition and translation the main idea behind this paper is using two separate s3d networks to encode rgb modality and human keypoint modality respectively according to the fact that sign languages use both manual articulations and nonmanual elements to convey information to make the two streams interact with each other authors propose bidirectional lateral connection sign pyramid network with auxiliary supervision and a framelevel selfdistillation strategy elaborate ablation studies have verified the effectiveness of each proposed component the results are very solid twostreamslr which is designed for sign language recognition task achieves 188 wer on phoenix2014 190 on phoenix2014t and 253 on newly published csldaily which greatly outperforms prior methods by large margins as for twostreamslt it also achieves stateoftheart performance on phoenix2014t and csldaily datasets strengths 1 this paper is very well written and easy to follow the motivation is very clear 2 the proposed twostream network with bidirectional lateral connection sign pyramid network auxiliary supervision and framelevel selfdistillation is technically sound i believe this paper will facilitate this research direction 3 systemlevel experiments are very solid this paper achieves stateoftheart performance on two sign language understanding tasks ie continuous sign language recognition and sign language translation across several datasets it is worth mentioning that twostreamslr outperforms previous best methods by large margins especially 
on csldaily dataset 4 a variety of ablation studies verify the effectiveness of each proposed component as shown in table 3 and 4ad as well as the tables in appendix weakness 1 authors could move the formulation of ctc loss and translation loss from appendix to the main paper to make the paper clearer due to the data bias and data scarcity issue there are unpredictable recognitiontranslation errors as shown in table 4 in appendix authors also discuss the limitations and societal impact adequately
### Summary: | this paper extends models for sign language recognition and translation with a dual encoder where first keypoint sequences are estimated using an offtheshelf model then fused with the video sequence it is a minor technical contribution to add the keypoint estimations as input since no new information was introduced however the authors demonstrated strong execution of experimental results this paper can be categorized with pipelinecascade approaches which rely on domain knowledge for engineered feature extraction and combination the paper presents many experimental results for architecture changes to improve results bidirectional lateral connection sign pyramid network and framelevel selfdistillation the authors convinced the reviewers with more experimental results during the rebuttal period leading to two solid and one borderline accept votes | [
... (input_ids: token ID sequence omitted) ...
] | [
... (attention_mask: all-ones sequence omitted) ...
] | [
... (labels: token ID sequence omitted) ...
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper provides a theoretical framework to understand the relationship between data sourcedownstream tasks and reward functions some clarification questions 1 at the end of page 3 you mentioned the maximum entropy policy as the fixed point to pibeta pibetapibeta could you provide a definition of the maximum entropy policy and a proof sketch of why it is equivalent to the fixed point in my understanding a maximum entropy policy is the optimal policy result from solving policy optimization with an entropy regularization its not obvious to me why this solution will coincide with the fixed point defined above 2 x never appears in definition 21 do you mean to define x fr ftr 3 i dont quite understand definition 25 as the mathematical definition of r seems to be independent of r 4 5 transformations are defined in section 21 do they mean to form an exhaustive list or are there potentially other relevant transformations not discussed here at a higher level an object eg expert demonstration can induce a set of reward functions not described by any finite combination of the listed or potentially new transformations so instead of characterizing the information flow through the reward function why not directly analyze whether the information from a data source is sufficient for a downstream task the answer to this question is completely independent of whether one chooses to learn a reward function or not for example the information captured in an expert demonstration is sufficient for the downstream task of optimal policy identification and therefore it is wellknown that despite the nonuniqueness of reward function in irl an optimal policy can be recovered regardless major comments discussion of related works can be significantly improved most of the mentioned related works on reward learning are empirical in nature it would be good to instead comprehensively survey the theoretical works that concern reward learning since the paper itself is theoretical in nature example application of the framework to improve uponsubsume prior results while the introduced rewardfunction centered informationtheoretical framework provide a novel and unified view on the informationtheoretical relationship between different objects it alone is in my opinion not sufficient for publication when you introduce a brandnew framework it is also important that you make some convincing argument of why this framework is useful in addition to being mathematically correct a typical way of doing this is to show that using the new framework one can easily recoverimprove results from various prior works and thus truly provide a unifying framework that subsumes prior literature i would suggest the authors throw a majority of the theorems and transformations in section 2 and section 3 into the appendix and use the space to set up one or two concrete examples on which the proposed framework can be useful in deriving newmatching upper and lower bounds 1 discussion of related works can be significantly improved 2 example application of the framework to improve uponsubsume prior results in summary i think this paper provides some novel insight in the problem of reward learning but there can be substantial improvements to be made to make the paper significantly stronger i would highly suggest the authors make the additional effort and i know its gonna be a lot of work but it will potentially make it a spotlightoral paper instead of a borderline docsepthis is a theoretical work on understanding the intrinsic limits of various data sources 
that are used for reward learning in rl in particular by considering the infinite data limit of the data source they study the level of reward ambiguity that can be obtained for a given downstream task for example for the expert behavior data source they characterize the reward transformations that are determined by the optimal qfunction similar attempts have been made previously for specific data sources and specific planning algorithms ng russell 2000 however this work makes substantial contributions by conducting this study in a unified and rigorous way for variety data sources and downstream tasks the paper is overall well written the related work is clearly discussed and this work is wellpositioned in the literature i enjoyed reading the paper i havent checked the proofs of the theorems however the justifications provided before the theoretical claims are convincing the paper is a bit heavy in terminologies however it is inevitable due to the theoretical nature and rigor purpose of the paper comments 1 re the optimalitypreserving transformations in definition 25 is it possible to extend this definition to regularized mdp eg entropy regularized mdp objective as well in that case will the function psi intuitively correspond to the regularized value function 2 in theorem 33 the sredistribution invariance and potential shaping invariances are not mentioned is this because they are less ambiguous than optimalitypreserving transformations 3 re theorem 310 for noiseless comparison based on the return compared to other objectstheoretical statements in this theorem there isnt any determinesstyle claim it is said that the precise monotonic invariances depend on the mdp it would be good to have a clarificationdiscussion on this point 4 all the results are derived for the finite mdp setting a brief discussion on the applicability or possible extension of these claims to the continuous mdp setting would be helpful this is a strong theoretical work it makes a fundamental contribution to reinforcement learning literature the reward functions are atomic to rl understanding the theoretical limits on how much information can be extracted from various data sources used for reward learning is important docsepthe submission considers a reward learning problem when the reward function is not uniquely recoverable from the data even in the infinitedata regime this paper is very hard to follow due to many nonstandard notation and concepts it is not even clear what the goal of the paper is the notion of invariance is vague and confusing i think the authors should have given more effort to setup the problem and goal more precisely and concisely while using easier to understand notation and terminologies the paper is very hard to follow and i may not be able to assess the paper properly docsepthis paper characterizes the partial identifiability of data sources and the reward function then it analyzes the impact of this measure on the optimum and the algorithms some implications are given this is an unconventional paper the problem its attacking is very fundamental and interesting given reward functions that are close to each other under some measure doesnt have to be rigorous and literal measures what will they affect in both the optimum of the mdp and the behavior of the downstream algorithms what the paper does is to provide a set of measures and a set of claims that partially answer the questions for the cases that are more or less lowhanging fruits these claims are not particularly strong i would say but 
they do provide some reasoning and implications on this very important topic i would personally be more interestied in a particular presumably not that general setting say just tabular and linear program with a clear and complete answer to that question to this end i would wonder if the manuscript better fits a journal publication or a book chapter etc but im more or less open for discussion in case this manuscript could provide some support for future studies side question can authors just compile one single pdf of 21 pages for the main submission per the policy better fits journal publication or book chapter
### Summary: | the paper formally studies the problem of partial identifiability when inferring a reward function from a given data source eg expert demonstrations or trajectory preferences to formally characterize this ambiguity in a data source the paper proposes considering the infinitelimit data regime which bounds the reward information recoverable from a source furthermore this ambiguity is then studied in the context of different downstream tasks as recovering an exact reward function may not be necessary for a given task the paper is primarily theoretical and the results provide a unified view of the problem of partial identifiability in reward learning for different sources and downstream tasks overall the reviewers acknowledged the importance of the problem setting and found the results promising there is quite a bit of spread in the reviewers final assessment of the paper with ratings 8 8 3 3 note one of the reviewers with rating 3 has a low confidence the authors responses did help in discussions however a few of the concerns as raised by reviewers still remained the key issues are related to the general accessibility of the paper and the lack of concrete examples to highlight the proposed theoretical framework at the end of the discussions several reviewers including those with an overall positive rating shared concerns about the papers accessibility with this unfortunately the paper stands as borderline nevertheless this is exciting and potentially impactful work and we encourage the authors to incorporate the reviewers feedback when preparing a future revision of the paper | [
... (input_ids: token ID sequence omitted) ...
] | [
… (machine-generated sequence of 1s omitted) ] | [
… (machine-generated sequence of comma-separated token IDs omitted) ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper introduces decstr which is an agent having a highlevel representation of spatial relations between objects decstr is a learning architecture that discovers and masters all reachable configurations from a set of relational spatial primitives they demonstrated the characteristics in a proofofconcept setup in the introduction the inspiration obtained from developmental psychology is described motivation and background are broadly introduced a wide range of related works are introduced in section 2 the motivation and target of this paper are ambitious and important however from the methods part ie section 3 this paper is hard to follow the supplementary material helps to understand however i believe some of the informative and detailed information in the supplementary material should come to the main manuscript the proposed method ie decstr is comprised of many components therefore the main contribution is also not clear what is the main argument of the paper experimental conditions are also hard to follow in evaluation figure 1 shows ablation studies alone ie comparison with the variants of decstr therefore the contribution of the paper is hard to grasp we can understand what kind of task is achieved in this paper currently the paper somehow seems to be a demonstration of decstr in this sense if the authors state research questions challenges and contributions of this paper more clearly that will make this paper more impactfuldocsepsummary this paper proposes decstr a goaldriven rl framework where the goal is represented as a binary vector that encodes the semantic relationships between objects the state is assumed to contain disentangled features for each of the objects and other features relating to the agents endeffectors the architecture is based on deep sets zaher et al 2017 which allows the pairs of the objects to be encoded with a shared network the paper also introduces a curriculum learning strategy similar to curious colas et al 2019 which relies on metrics such as competence and learning progress lp in order to select goals to pursue during an episode one key difference is that unlike curious which uses expertdefined goal buckets decstr groups the goals based on recency of discovery once trained to be able to behave with respect to these semantic relationship goals the second phase is language grounding they learn a module implemented as cvae that converts from natural language text to the semantic configuration goal space experiments were conducted in the fetch manipulate robotic arm environment and compared with ablations of decstr without some of its components demonstrating strong performance and generalization to various types of language instructions pros the paper is wellmotivated citing literature from several fields the sum is greater than its parts many components in decstr are based on existing works eg deep sets cvae using lp for intrinsically motivated goals etc but empirically they have shown through ablations that all of their components were necessary for the agent to solve the fetch manipulation task successfully the experiment sections are fairly thorough with ablations on the components of their methods as said above and various kinds of language command generalization evaluations in a similar style to imagine colas et al 2020 the interpretability of the semantic goal space aspect is interesting and being able to have the agent explicitly maps from the natural language text to the semantic goal space also helps us debugunderstand what the agent is 
thinking at inference time cons part of the thesis is that decoupling of sensorimotor learning from language acquisition is advantageous to an endtoend language to sensorimotor learning i have concernsclarification about some of the baselines which might not have been a fair comparison with decstr see question 1 2 below some parts of the method are unclearvague without reading the appendix section to get the full detail i understand that is due to the space limitation issue and because there are so many components to decstr see question 3 recommendation overall i vote for marginally below acceptance threshold in the current form as mentioned in the strengths section i do like the motivation of the paper and the strong performance of the method but i am also suspicious of the poor performance of the baselines eg figure 1c which may be due to not having her instead of their proposed contributions it would be good if the authors can clarify that concern question 1 in figure 1c for the language goals baseline was her applied to the language goals in this case ie similar to actrce chan et al 2019 imagine colas et al 2020 similarly was her applied to the positiongoals baseline if not then it is possible the difference in performance between decstr and these baselines may be due more to her than due to the difference in goal representation 2 would it be possible to train phase 1 and phase 2 together or in an endtoend fashion this would provide a coupled version that is different from any of the baselines studied in the paper because it still uses the semantic configuration as the intermediate goal representation while having joint training of the language representation and the sensorimotor if this baseline struggles to learn possibly due to difficult optimizationlocal minimas then this will help further strengthen the thesis of the importance of decoupling the learning process into two distinct phases 3 section 32 the main text and appendix c2 was not very clear about the second inductive bias for the symmetry of the behavior required to achieve aboveoi oj and aboveoj oi are you saying for example if we are trying to have object 1 above object 2 then we specify the goal in the form g1 while if we want object 2 above object 1 then we specify the goal in the form g2 minor comments when using double quotes in latex use backticks for the opening quote after rebuttal responses i have read the authors updated draft and response to my concerns as well as the other reviews the updated paper provides a clearer framing and some missing baselines have also been included i raised my evaluation to a weak acceptance for the paper docsepthis work proposed decstr a procedure for encouraging intrinsic motivation via an intermediate semantic statespace representation the authors propose an intermediate semantic state space that the intrinsically motivated agent learns to explore for the environment provided a 3block system the agent fully explores the symbolic state space reaching all feasible symbolic states in the second part of the work the authors train a language model capable of proposing symbolic goals in the form of symbolic states from natural language input and shows that the previouslyintrinsicallymotivated agent can now be made to reach these goals demonstrating that the symbolicgoalconditioned policy is sufficient for instruction following in their 3block domain the work is generally interesting and seems to address a simple version of a broader class of problems that embodied agents typically 
struggle with particularly in the absence of clear goals however the approach presented in the behavior in particular the form of the semantic representation that is claimed as one of the primary contributions of the work is very specific to the single problem used for demonstrations in the paper limiting the potential impact of the work firstand i think the most significant issue with the submissionis that many critical experimental details are included only in the lengthy appendix much of this information including the information provided to the learning algorithm at every step and how that information is encoded such that it allows for a relatively objectagnostic representation is only available in sufficient detail in the appendix relatedly visualizations of the approach and experimental setup also only appear in the appendix yet are extremely helpful if not essential for understanding detail critical to understanding the approach should be included in the body of the text second it is unclear exactly what problem is being solved in this work or what its primary contribution is a clearer statement of its motivations will be necessary before publication what problem is the robot or system designers trying to overcome right now the paper seems to come up with three potential answers to this question none of which necessarily rises above the others here are what i think the main contributions of the work could be 1 the proposed semantic representation the semantic goal representation used to define the space of intrinsic motivation seems to be a novel contribution however if the paper were to focus on this aspect of the contribution it would need to do a better job understating why this representation were useful beyond a relatively small manipulation task critically using only one problem setting with only three blocks is insufficient to convince the reader that this representation is useful more generally as might be suggested by much of the talk about inductive bias 2 state of the art statespace exploration in intrinsic motivation this might be true though i find such a thing hard to measure in addition it seems that many if not all of the tools used in the learning process are not novel perhaps a combination of this and point 1 is the primary contribution 3 state of the art performance on languagedriven block manipulation tasks this might be true as well but the results are sofar unconvincing all baselines are varied forms of the proposed agent which makes it difficult to compare against other approaches eg something like li et al 2019 the paper currently seems to claim that the combination of progress in these three areas is a novel contribution i am sympathetic to this idea as i do not believe that every paper needs to be state of the art in one single thing though it is sufficiently unclear at the moment what the takeaway message of the paper is that i cannot recommend it be published in its current state in particular the authors need to work on honing the message of the paper it is also not unlikely that one or two more experiments will need to be added to support the focused narrative smaller comments the name of algorithm should appear in the body of the text not a footnote relatedly it is unclear how the proposed approach uses the deep sets work in such a way that it justifies inclusion in the name of the proposed technique the paperintroduction would benefit from a summary of contributions even after reading it may not be clear to a reader which contributions are from this 
paper versus other work relatedly much of the discussion of inductive biases that appear throughout the paper is of mixed relevance for this work on the one hand it is clear how the idea of an objectcentric inductive bias helped to inform how the input to the neural network was encoded in a way that might allow the agent to apply its knowledge learned between two of the objects to a policy that allows it to manipulate all three however the goal condition is necessarily specific when it comes to representing which objects to which each element it refers the structure of the goal and the semantic relations it encodes are quite specific to the particular problem at hand and it is the reward for the position only baseline seems artificially constructed a nonbinary reward function would likely allow the system to learn more easily as of now i am unconvinced that the authors have worked hard enough to make a fair baseline for comparison this is particularly problematic since this baseline is a key motivator for the existence of the proposed semantic goal representation the paper overall is quite well written despite relegating too much information to the abstracts docsep the decstr systems intrinsic motivations may be applicable to other application domains depending on how objects and relations are enumerated this potential is not explored beyond the toy environment presented the learning methods especially inductive biases are handcrafted based on humanlevel knowledge about semantic predicates but only two above and close are demonstrated without demonstrating the system on any other configuration or world its difficult to tell whether its able to solve only the problem its been crafted to solve in this specific environment questions 31 in principle could use any other combination of binary predicates and could be extended to use nary predicates this claim is not demonstrated in the paper and in 32 the inductive biases seem bespoke crafted for binary predicate above which has particular symmetry would similar careful design of inductive biases be necessary and possible for nary predicates that do not demonstrate these as easily eg topmost what about predicates that involve an unspecified number of discrete arguments like base holding up an indefinite n of other objectsstructures in use the green block as the base 34 or is union this doesnt generally hold for natural language a statement like put the red block or the green block above the yellow block does not mean to put both red and green union of goals above yellow typically langauge or is xor is the notion of or here not given in language or not meant to represent human language areas for improvement 5 a learning architecture that discovers and masters all reachable configurations from a set of relational primitives this is literally true but only demonstrated on a single set of relational primitives so it feels like overclaiming nits double citation for mandler 2012 in intro in adjacent sentences can be condensed to once footnotes on other side of period besides in blocks manipulation seems a bit offsounding maybe in addition typo section 3 based o abstract typo section 5 backwards quotes overlapping waves lhs caregiver in section 5 is an unintroduced role the rest of the paper does not frame decstr or the oracle generator this way ending the paper with etc feels weirdinformal
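The decstr reviews above repeatedly refer to a binary semantic configuration goal space built from spatial predicates such as close and above over pairs of blocks. As a rough illustration only — the predicate definitions, thresholds, and three-block setup below are assumptions made for this sketch, not the paper's actual implementation — such a configuration vector could be computed along these lines:

```python
import numpy as np
from itertools import combinations, permutations

def semantic_configuration(positions, close_eps=0.07, above_eps=0.03):
    """Binary semantic configuration over object pairs.

    positions: mapping from object id to a 3D position (x, y, z).
    Returns one close() bit per unordered pair followed by one above() bit
    per ordered pair. Thresholds are illustrative, not taken from the paper.
    """
    ids = sorted(positions)
    # close(o_i, o_j): symmetric, so one bit per unordered pair
    close_bits = [
        float(np.linalg.norm(np.subtract(positions[i], positions[j])) < close_eps)
        for i, j in combinations(ids, 2)
    ]
    # above(o_i, o_j): o_i higher than o_j and roughly aligned horizontally
    above_bits = [
        float(positions[i][2] - positions[j][2] > above_eps
              and np.linalg.norm(np.subtract(positions[i][:2], positions[j][:2])) < close_eps)
        for i, j in permutations(ids, 2)
    ]
    return np.array(close_bits + above_bits)

# Example: block 0 stacked on block 1, block 2 off to the side.
blocks = {0: (0.0, 0.0, 0.05), 1: (0.0, 0.0, 0.0), 2: (0.3, 0.2, 0.0)}
print(semantic_configuration(blocks))  # 3 close bits + 6 above bits
```

With three blocks this yields a 9-dimensional binary vector, consistent with the reviews' description of a small discrete goal space that an intrinsically motivated agent can exhaustively explore.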
### Summary: | this paper presents a new approach to grounding languagebased rl tasks via an intermediate semantic representation in an architecture called languagegoalbehavior lgb the architecture permits learning a mapping from internal goals to behavior gb separately from learning a mapping from language to internal goals lg and prior to flexibly combining all three lgb the architecture is studied in a specific implementation called decstr the architecture has multiple desired attributes including support for intrinsic motivation decoupling skill acquisition from language grounding and strategy switching the experiments demonstrate the utility of different components in the architecture with a variety of ablation results the reviews initially found the paper to be poorly organized with required content described only in the appendix r1 r2 r4 with unclear main contributions r1 r2 r4 and with results restricted to demonstrations r3 despite these reservations the reviewers found the content to be potentially relevant though narrow in scope the authors substantially revised the paper they improved its organization clarified contributions separated the architecture from the specific examples and improved the experimental baselines after reading the revised paper the reviewers agreed that the papers organization and insights were improved making the new papers contribution and insight clear the experimental baselines were also improved providing more support for the potential utility of the proposed method three reviewers indicate to accept this paper for its contribution of a novel approach to grounding language and behavior with an intermediate semantic representation no substantial concerns were raised on the content of the revised paper the paper is therefore accepted | [
… (machine-generated sequence of comma-separated token IDs omitted) ] | [
… (machine-generated sequence of 1s omitted) ] | [
… (machine-generated sequence of comma-separated token IDs omitted) ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
robust methods for equation discovery respecting physics are of utmost importance for the application of datadriven methods in science and engineering the present contribution proposes a new framework based on physical neural networks pnn it is assumed that the function $\mathbf{y} = \mathbf{f}(\mathbf{x})$ is polynomial with possibly negative exponents the network has only one hidden layer as far as i understood this hidden layer takes as input the state variables $x_i$ of the system and outputs $z_j = \exp\left(\sum_i p_{ij} \ln x_i\right)$ in the above expression $p_{ij} = \tanh(c_1 w_{ij}) \cdot \sigma(c_2 w'_{ij})$ with $\mathbf{w}$ and $\mathbf{w}'$ parameters to be learned the final output layer is a simple weighted sum ie $y_k = \sum_j m_{jk} z_j$ sparsity of the model results from the use of the sigmoid activation function in the hidden layer while the hyperbolic tangent makes sure $p_{ij}$ is an integer either positive or negative along with this simple architecture the authors propose the socalled rise framework to include realizability conditions eg physical constraints the model needs to satisfy these constraints are enforced only weakly by penalizing the loss function to be minimized in the training stage the overall model is tested on a number of nonstandard benchmarks to illustrate its capabilities strengths model parsimony often synonymous with sparsity but not necessarily has been an overarching goal in the physical sciences ensuring that the model is interpretable and physically consistent is also crucially important while this has been widely recognized in the system identification community it is my impression that this problem is not studied as much as it should be in the machine learning community the contribution by the present authors goes in this direction the design of the polynomial layer in pnn combining sigmoid functions for sparsity and the hyperbolic tangent for integer exponents is interesting and to the best of my knowledge is original although i don't find it flexible enough the rise framework is a step in the good direction to enforce realizability conditions in the model identification procedure weakness despite these positive points this contribution suffers from certain limitations the most important ones are discussed below concerning the proposed method the architecture of the network implicitly assumes that the unknown function $\mathbf{f}$ is a polynomial what if it is not eg the nonlinear pendulum or the kuramoto model for oscillator synchronization does it identify the taylor or laurent or padé series expansion of $\mathbf{f}$ what if the system is only partially observed does the identified model include closure terms are $c_1$ and $c_2$ in the sigmoid and hyperbolic tangent functions free parameters or are they actually hyperparameters concerning the rise penalty despite what is stated in the introduction pnns are not restricted to systematically obeying the physical properties and actually have hyperparameters that need to be tuned the constraints are weakly imposed by penalizing the cost function hence the coefficient for this penalization is a hyperparameter and if the latter is too small then the identified model will not necessarily satisfy the constraints moreover this framework is not as universal as what is claimed by the authors in particular the symmetry part symmetry in phase space does not necessarily imply symmetric coefficients in the equations this is easily exemplified by the harmonic oscillator $\dot{x} = y$ and $\dot{y} = -x$ where the coefficients are antisymmetric despite the trajectory in phase space being a circle rather
than focusing on symmetric coefficients a better approach would be to enforce equivariance ie dynamics of the form $\mathbf{s}\,\dot{\mathbf{x}} = \mathbf{f}(\mathbf{s}\,\mathbf{x})$ although i cannot remember of any proof i believe that symmetry in the coefficients actually implies some form of conservation something which might be useful for the power systems considered in this contribution but is too restrictive in general concerning the theoretical results most of my concerns have already been raised by reviewer vzaq concerning the experiments the test cases used for the comparisons are nonstandard at least from my system identification point of view although it may have been done out of simplicity by the authors using poorly documented benchmarks is a surprising choice when introducing a new method experimental details are insufficient how much data is used how is it split how are resnet and pnn trained optimizer number of epochs batch size learning rate etc which functions are included in the library for sindy and eql which algorithm formulation of sindy has been considered while the method proposed by the authors explicitly takes into account some constraints only the vanilla version of sindy has been considered to the best of my understanding yet as shown in 1 2 sindy can be extended to incorporate physical constraints in the model identification problem using sindy with constraints would thus be a somewhat fairer comparison system identification techniques are often used in the lowdata limit either because simulating the highdimensional system for a sufficiently long period of time would require millions of computing hours or because data is gathered from extremely large experimental facilities that cannot be operated for long how does the method behave in this lowdata limit eg when you have fewer data points than free parameters in the neural network numerous systems have rapidly attracting manifolds in phase space which express themselves as nonlinear correlations in the time series consider the following model of selfsustained oscillators $\dot{x} = \sigma x - y - xz$, $\dot{y} = x + \sigma y - yz$, $\dot{z} = -z + x^2 + y^2$ with $\sigma > 0$ it is easy to show that very rapidly $z(t) \simeq x^2 + y^2$ ie despite the negative term in the third equation $z(t)$ actually grows over time because of the nonlinear coupling given only data simple techniques would identify the following equation for $z(t)$ namely $\dot{z} = \alpha z + \beta z^3$ which is obviously wrong although consistent with the data how would your technique behave in such a situation where nonlinear crosstalk between variables is what drives the dynamics at first order miscellaneous the quality of the english should not be taken into account when assessing the scientific quality of a contribution yet there are a lot of typos in the paper that could have been avoided using a simple spellcheck it is unclear until relatively late in the paper that the loss function considered is the mean squared error although i do understand that a different metric could be used during the training stage i think that making this clear early on is needed references 1 loiseau brunton constrained sparse galerkin regression journal of fluid mechanics 2018 2 kaptanoglu et al promoting global stability in datadriven models of quadratic nonlinear dynamics physical review fluids 2021 robust methods for equation discovery respecting physics are of utmost importance for the application of datadriven methods in science and engineering although the method presented in this contribution seems interesting its presentation lacks clarity as discussed in the core of my review
comparisons with already existing techniques are biased eg sindy with constraints is not considered and are not presented for representative cases or welltested benchmarks other issues are also discussed in the review hence i cannot recommend this work for publication docsepthis paper tackles the problem of model inference and prediction while preserving physical correctness this is an interesting and important problem in supervised learning for interpratability and generalization reasons it is also quite important in modelbased reinforcement learning in the context of nonlinear system identification the authors start by challenging existing sparse regression approaches such as sindy and symbolic regression approaches such as eql because they lead to nonphysical solutions with low loss because they depend on hyperparamters hence they propose what is coined a physical neural network pnn where layers have a laurent polynomial shape with learnable coefficients and powers weights corresponding to the layer input powers are decomposed into a tanh activation multiplied by a sigmoid activation this last design choice allows to enforce some kind of sparsity when sigmoid converges to 0 and constrains powers in the range 1 1 when active because of tanh furthermore it ensures differentiability of the integer valued layerinput powers additionally the authors also propose to penalize the model training with what they call a rise constraint the latter is meant to enforce physically plausible range inertia symmetry and extrapolation properties in the inferred model strengths the problem tackled is very important and indeed not studied as much as it should by the community the experiments showing that existing approaches fail to recover the underlying equation structure for simple systems illustrates this well the idea of relaxing the polynomial layer of pnn with tanh and sigmoid functions is very interesting appealing and seems original to me the idea of proposing a unifying formalism for physical constraints such as rise is interesting and should be helpful the proposed method is showcased in a number of diverse experiments and compared to relevant sota methods weakness in my opinion despite these positive points the article has major weaknesses in terms of theoretical and experimental soundness and clarity given the multiple clarity concerns it is possible that i misunderstood some things so please let me know it this is the case here follows my more detailed concerns concerning the proposed method 1 it is not clear to me how you go from outputs in the 1 1 range to the k k range 2 furthermore i must say that the laurent polynomial structure of the method proposed does not seem expressive enough to model many important physical systems which do not have a polynomial structure this seems to me as a weakness compared to previous methods such as eql and sindy which are built on top of an arbitrary dictionary of basis functions trigonometric functions exponentials etc in this sense the hypothesis space of pnn is included in sindy and eql ones concerning the rise penalty 3 concerning the rise constraint while the formalism is interesting its motivation lacks illustrative examples other than the power line example where else do we see the symmetry constraint and how are all these constraints implemented in the 4 experiments presented in the paper how are constrained weights chosen for example i imagine you dont symmetrize all weights do you furthermore the constraints presentation is not well connected to 
existing penalties from the statistical learning literature for example inertia constraint is usually modeled by smoothness inducing regularization eg smoothing splines 1 dynamic system constraints like in system identification 2 4 also i was a bit surprised and confused because in section 31 second paragraph it is said that existing methods such as sindy add additional hyperparameters lasso penalty weight and that the proposed method does not have this flaw but the rise penalty also has a penalty hyperparameter to be tuned concerning the theoretical results 5 it is said just before proposition 1 on page 5 that such a design converts the nonconvex multiplication to a convex form of linear summation for nn training i imagine that this refers to the expsumlog rewriting of the polynomial zjprod xipij at the end of page 4 as expsum pij ln xi am i correct if this is the case i cannot agree with the statement as these two expressions are exactly equal and hence both convex in pij they are the same function there seems to be a confusion here because zjprod xipij is nonconvex in xi please let me know if i missed something here 6 concerning proposition 1 the searching algorithm is not defined one has to go to the appendix to understand that you mean alternate convex search 7 more importantly i cannot agree with the results or at least not with its relevance here indeed in addition to the linear weights mjk the optimization variables of the proposed pnn model are not the pijs but rather the auxiliary wij and wij weights and it happens that the problem is not biconvex wrt them as evidenced clearly by figure 7 i do agree however that the problem is biconvex wrt pijs and mjk and that acs would work if one would optimize these variables instead however this is not what the paper proposes 8 at the end of the proof of theorem 1 page 7 you state that the piecewise strong convexity from rise regularization guarantees that we can find the global optimal point for each subregions and the global optima satisfies constraints for physical parameters while you proved indeed the strong convexity and while this implies that the global minimum can be achieved with gradient descent the optimization problem solved has been relaxed and as such has no hard constraint imposed but rather a penalty term hence nothing guarantees that the problem solution respects the constraints not true for low values of lambda for example 9 also inside the proof of theorem 1 you replace the l0 norm in the symmetry constraint by a l1 norm but this is not said in the theorem statement for clarity and rigor you should define the symmetry constraint directly in terms of l1 norm saying that it is relaxing an ideal l0 norm which is not used in practice the proof is not the right place to do the switch concerning the experiments 10 as evidenced by sentences like subsequently can we find the global optima and can it represent the true physics the paper seems to imply that physical solutions correspond to global minimum of the training loss and that local minima are bad solutions this is however not really shown in experiments actually figure 2 seem to indicate the opposite physical models have higher loss 11 overall the experimental settings and protocols of all results are not sufficiently explained which makes it impossible to interpret and assess them for example in figure 2 what are the hypothesis spaces considered what method is used to obtain those scores sindy eql how are hyperparaameters selected specially important as this is highlighted 
as a key pain that the proposed method should alleviate same goes for figure 7 in appendix a 12 also while you say at the end of page 3 that mathcall is the loss it is unclear in figure 2 whether the reported mathcall is the training loss the test loss the crossvalidated loss it is also never stated how many data points are used and how the data is split 13 experimental details are also highly insufficient in figure 4 how is the data split could not find even in appendix do the scores correspond to the training or testing error how is optimization carried in resnet pnnrise optimizer number of iterations batch size learning rate definitions of convergence etc what dictionary of basis functions is used for sindy and eql does it include the true structure and yours how are hyperparameters tuned these details are very important to assess the fairness of the comparison here it is indeed difficult to understand what in the proposed method allows it to beat the others given that its hypothesis space is theoretically included in the others maybe the rise constraint or the training procedure the experiments dont really help to answer this question 14 also what is the difference between ps1ps2 not explained in the appendix 15 how many runs are used to compute the recovery rates 16 you say that eql easily fails in large power system case but you never state the dimension of the problems considered 17 in figure 5 a limited data case is considered but it is not explained what this means exactly how much less data is considered and how much did we have in the beginning also what is the outofscale invariants case other clarity concerns 18 it is confusing to say in the last paragraph of page 3 that with integer coefficients or not while the pnn model with integer coefficients was not introduced yet 19 likewise the proof of theorem 1 seems to be the first time the reader learns that the loss considered in the paper is always the mse 20 what is meant by we complete the function y fx on top of page 5 1 green p j silverman bw 1994 nonparametric regression and generalized linear models a roughness penalty approach chapman and hall 2 l ljung system identification theory for the user prenticehall 1987 although the paper highlights weaknesses of existing approaches for a very important learning problem and proposes interesting novel modeling ideas it has major weaknesses in terms of theoretical and experimental soundness and clarity hence i cannot recommend it for publication at its current state docsepthis paper studies the problem to learn physical equations by neural networks to address the limitations of traditional methods using sparsity to learn physical equations this paper proposes to use the physical neural network as a container with additional constraints to explore range inertia symmetry and extrapolation rise using the proposed method it is expected that less local minima will be explored for better solution experiments have been conducted to support the presented method strength 1 this paper shows a nice connection between machine learning models and a realworld application ie estimating physical equations 2 this paper has been well motivated because directly using a deep neural network physical equations might not be learned well due to a number of reasons such as too many local solutions as shown in figure 2 3 the experiments have been done pretty comprehensively including those on synthetic data and those on real physical systems weakness 1 some details of the experimental settings are missed for 
example in experimental section support vector regression svr is used as baseline what kernels are used in svr and other parameters such as the box parameter this paper solves the problem of estimating physical equations by a physical neural networks pnn with the improvements to take into account range inertia symmetry and extrapolation rise that are important property of physical equation learning the paper is well motivated and the method has been clearly described experimental results on both synthetic and real data support the presented method docsepa framework for learning succinct and humanly interpretable descriptions of physical systems from data is outlined the main focus of the approach is eliminatingavoiding local optima which is achieved through both architectural design of the model and modification of learning algorithm empirical evaluations conducted on simulated data suggest improved error metrics as well as recovery of ground truth equations the proposed neural network is of very specific design resulting in the weighted sums of products of original inputs and their integer powers sparsity in parameters is enforced using the tanh and sigmoid transformations of weights additionally model parameters are constrained by domain knowledge induced physical principles like penalties on symmetry and smoothness and limits and inequalities of parameter values final piece of the framework is regularization term based on outsideofthesample predictions even though the individual tricks are likely used before to the best of my knowledge this particular combinationinstance of the neural network is unique so far the empirical evaluation is sound and shows superior fit and physics recovery just wondering why is 0990 or its integer multiple like 1980 dominating the near global optimum row of the figure 2 the article is quite descriptive and detailed just i think the ablation study would better serve as a part of the main article and instead of the visual some quantitative measure of contribution of each of the piece would be appreciated text is clear with just few typos like bus actual knowledge or different as active learning update after insight into other reviewers comments and authors responses i tend to agree with concerns on theoretical soundness of certain aspects and need for more appropriate experimental benchmarks therefore i am reducing the initial score i think that study described in this paper would be beneficial for the conference readers and especially the complex systems modeling community docsepthis paper is about learning physical equations with symbolic regression with neural networks while the carried out work seems interesting the paper does not allow to clearly highlight the contributions due to a partial presentation the paper would have benefitted from a comprehensive presentation that provide the tangible elements to explain all the building blocks of this work for instance the socalled physical neural network pnn is presented in terms of the elements defining it however it is not clear its overall architecture and most importantly the optimization method while proposition 1 give some elements about the biconvexity and the searching algorithm that can find the stationary points of pnn it is never explained what is the used algorithm moreover in the description of the method section 3 we can find some incorrect information such as pij is the index number of xi it is its power summationabstraction finally sentences like after providing a flexible design to limit 
possibilities of local optima are not founded with theoretical proofs the title is not appropriate this paper is not about sparse representation learning which is a more general topic than the one addressed in this work it is on learning physical equations with symbolic regression moreover the authors never addressed the question in the title of this paper from the beginning of the paper the authors emphasize on the main motivation of this work which is internet of everything ioe that becomes iot in page 2 however almost nothing in the paper is related to this motivation as all developments are not related to ioe as well as most experiments the experiments are not convincing the authors compare the proposed method to 4 other methods including 2 methods that are not for symbolic regression svr and resnet the only used methods that are related to this work are sindy sparse identification of nonlinear dynamics and equation learner the authors need to provide a comprehensive experimental analysis with a comparative analysis on several recent methods from the literature such as raissi m perdikaris p karniadakis g e 2019 physicsinformed neural networks a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations journal of computational physics 378 686707 lusch b kutz j n brunton s l 2018 deep learning for universal linear embeddings of nonlinear dynamics nature communications 91 110 raissi m 2018 deep hidden physics models deep learning of nonlinear partial differential equations the journal of machine learning research 191 932955 udrescu s m tegmark m 2020 ai feynman a physicsinspired method for symbolic regression science advances 616 eaay2631 see also willard j jia x xu s steinbach m kumar v 2020 integrating physicsbased modeling with machine learning a survey arxiv preprint arxiv200304919 11 134 the numbering in the body text does not correspond to the one in figure 1 making it difficult to understand there are some spelling and grammatical errors that can be easily identified and corrected such as we propose an new training to see if these observation is special bus actual knowledge proof is inlcuded in appendix for each subregions by combing their initials leads an important extrapolation the data in represented as our method is different as active learning visualization of the layer design equation rise principals for physical laws the load files are need as we think that this paper is not of sufficient quality to be accepted in iclr for at least the reasons mentioned in the main review section
### Summary: | the topic and ambition of this paper have been judged as important by all reviewers yet there is a consensus that the theoretical and experimental contribution is not strong enough to effectively argue for an important novel lead which would justify publication at iclr; for these reasons this paper cannot be endorsed for publication at iclr 2022 | [
…(input_ids token sequence omitted)… ] | [
…(attention_mask sequence, all 1s, omitted)… ] | [
…(labels token sequence omitted)… ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in this paper the authors propose a sceneaware pose estimation framework based on the concept of embodiment different from previous work with multistage optimization noncausal inference and complex contact modeling for the sceneaware pose this method is simpleonly one stage and recovers robust sceneaware pose in the simulated environment besides to simulate good poses they disentangle the camera pose and use a multistep projection gradient defined in the global coordinate frame as the movement cue for our embodied agent this method only needs 2d observation and can be trained on synthetic datasets for realworld pose estimation they achieve the stateoftheart on prox dataset without using any training data in this dataset strengths the main contributionsimulating sceneaware humanoid to mimic visual observation and recover 3d poses in the realworld environment is significant i believe it will be a good baseline for the sceneaware pose estimation this method achieves promising results on the challenging poses of prox dataset the mgp is reasonable on 3d pose recovery and achieves significant improvements shown in ablation studies this paper is well written and easy to follow weakness the simplified scene representation causes some failure cases in the poses of sitting or laying the contact modeling between motion and scene is a little bit unclear please follow the weakness docsepthe paper proposes a method for estimating 3d human pose given a monocular rgb video observing a human moving through a scene tha has been previously reconstructed ie a scene for which a 3d mesh reconstruction is available the method is based on physical simulation of the human moving through a geometrically simplified version of the scene the simulation is driven by a multistep projection gradient connecting 2d pose keypoints to the controller that drives the humanoid pose in simulation the experiments evaluate the proposed method as well as ablations against a breadth of methods from prior work on two datasets prox and h36m performance is evaluated in terms of pose accuracy mainly joint error metrics where ground truth pose is available and pose plausibility mainly distance to scene geometry interpentretation frequency and distance etc the results show that the approach is competitive with prior work that uses image features on the h36m dataset and mostly outperforms prior work on the prox dataset strengths the physicsbased formulation is novel and offers a complemetary approach to the 3d human pose estimation problem statement relative to most prior work the fact that it can provide strong performance is quite impressive given the limited amount of input information that is given compared to other methods the paper is wellwritten and presents a breadth of ablations that analyze the impact of different components of the method on performance weaknesses the approach relies on a semiautomatic simplification of the 3d scene geometry to address failure modes due to reconstruction noise i would have liked to see a more detailed discussion of this issue and also of the actual process for creating these simplified scenes this is an important detail as it constitutes a significant modification to the input available to the presented method relative to prior work ideally there would be an ablation that would report results based on the original unsimplified scene geometry to concretely measure the impact and value of this semiautomatic step limitations are described in the last section of the paper mainly focusing on the 
simplification of human bodies and scene geometry to rigid bodies as far as i can tell there is no discussion of potential negative societal impacts despite a statement pointing to the supplement in the checklist docsepthe paper presents a 3d human pose framework based on both simulators and 2d3d human estimators it recovers both global 3d human poses and local human poses the methods can be applied to daily activities such as prox dataset strengths it sounds novel to me as i am not a simulator guy but a learning and 3d person guy using these simulators naturally will help generate lots of physics priors and encode the physics into the learning process i may not provide enough evidence for the novelty of the simulation part weaknesses i did not see the comparison with neural mocon neural motion control for physically plausible human motion capture if so what is the difference can you work on the dataset they are working on i think the performance is somewhat unsatisfying we can see several metrics in table 2 is not sota several typos line 287 scene penetration frequency freq and distance pen line 204 keytpoints line 202 rotationrr yes docsepthis paper aims to perform singlecamera pose estimation it uses 2d key points and then uses a 3d simulated scene that the person is within to enable an improvement in the pose estimation the approach is tested in h36m and prox datasets strengths the work doesnt require complex contact modelling or multistages of optimization processes it is a single inference step the work proposes a temporal gradient projection to smooth the estimation over time the work is able to perform well on the pox dataset weaknesses the h36m dataset is illsuited for scene modelling as there are few objects within the foreground and seems to perform poorly and is distracting for the paper the mpg is heavily based on the work of 3 there is very limited experimental discussion around the work of the prox dataset making it hard for a reader to fully appreciate the performance of the approach there is a disconnect between the 2d keypoint detection and the 3ddepth perfect generation of the scene none stated
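The reviews above evaluate pose accuracy with joint-error metrics and pose plausibility with scene-interpenetration frequency and depth. A minimal sketch of how such numbers can be computed is given below, assuming ground-truth joints, sampled body-surface points, and a signed distance function of the scene that is negative inside geometry; the exact definitions used for PROX and H36M may differ from this illustration.

```python
import numpy as np

def mpjpe(pred_joints, gt_joints):
    """Mean per-joint position error over a sequence of (T, J, 3) arrays (metres)."""
    return np.linalg.norm(pred_joints - gt_joints, axis=-1).mean()

def penetration_stats(body_points, scene_sdf, tol=0.0):
    """Scene-penetration frequency and mean depth.

    body_points : (T, P, 3) sampled body-surface points per frame (assumed sampling)
    scene_sdf   : callable mapping (P, 3) points to signed distances,
                  assumed negative inside scene geometry
    """
    depths, penetrating_frames = [], 0
    for frame in body_points:
        d = scene_sdf(frame)
        inside = d < -tol
        if inside.any():
            penetrating_frames += 1
            depths.append(-d[inside].mean())   # how far inside the scene, on average
    freq = penetrating_frames / len(body_points)
    mean_depth = float(np.mean(depths)) if depths else 0.0
    return freq, mean_depth
```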
### Summary: | the submission initially received mixed reviews after rebuttal all reviewers felt their concerns reasonably addressed and recommended acceptance though one didnt update the score the ac agrees the authors are encouraged to revise the paper accordingly
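The movement cue discussed above, a multi-step projection gradient that connects 2D keypoints to the simulated humanoid, can be pictured with the following minimal PyTorch sketch: it repeatedly takes the gradient of a 2D reprojection error with respect to world-frame 3D joints under a fixed pinhole camera. This is a generic illustration rather than the paper's MGP; the camera model, the step rule and step count, and how the resulting displacement drives the embodied controller are all assumptions.

```python
import torch

def projection_gradient_cue(joints_3d, keypoints_2d, K, R, t, n_steps=3, lr=0.01):
    """Accumulated gradient of 2D reprojection error w.r.t. world-frame 3D joints.

    joints_3d    : (J, 3) current joint estimate in world coordinates
    keypoints_2d : (J, 2) detected 2D keypoints in pixels
    K, R, t      : pinhole intrinsics (3, 3), rotation (3, 3), translation (3,)
    """
    x = joints_3d.detach().clone().requires_grad_(True)
    for _ in range(n_steps):
        cam = (R @ x.T).T + t                   # world -> camera frame
        uv = (K @ cam.T).T
        uv = uv[:, :2] / uv[:, 2:3]             # perspective division to pixels
        loss = ((uv - keypoints_2d) ** 2).sum(dim=-1).mean()
        (grad,) = torch.autograd.grad(loss, x)
        x = (x - lr * grad).detach().requires_grad_(True)
    return (x - joints_3d).detach()             # displacement used as a movement cue
```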
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
edit satisfied by the rebuttal and the proposed modifications by the authors i decided to increase my grade by 1 point the paper studies multipletry metropolis for discrete state spaces a variant of mh where one draws several times independently from the same proposal at each iteration the main result is a bound on the mixing time with an additional algorithmic degree of freedom the balancing function compared to previous work an algorithm for selecting the number of trials in each mcmc iteration is also given supporting experiments are shown in two bayesian model selection tasks overall i like the paper as it gives a thorough treatment of whether mtm improves the mixing time of mh for a theoretical paper it is well presented although the text is sometimes a bit dense and would benefit from having a running example like bayesian variable selection as well as a more detailed experimental section strengths a mixing time bound for mtm a discussion and an algorithm for automatically choosing the number of trials an honest treatment showing that we gain in practice only if we can parallelize evaluations of the weight function weaknesses lack of details in the experimental section lack of comparison to other mcmc kernels for the same models eg reversiblejump mcmc however since the main purpose of the paper is its theoretical bound and the improvement over classical mh this is a small weakness the improvement over classical mh seems to be swapping a factor in the number of sequential iterations for the same factor of parallelizable evaluations thus i would expect more to be said on how efficiently the parallelization can be implemented and how much can be gained in a challenging application on real data only toy models are investigated adequately addressed docsepupdate i acknowledge the rebuttal and raise my score to accept given the extensive discussion and provided improvements the paper studies multiple try metropolis mtm hastings algorithms a class of mcmc algorithms for discrete state spaces a theoretical bound on the mixing time is provided that shows that mtm mixes faster than standard metropolis hastings mh the results suggests a new class of weight functions that improves upon standard mtm the results are highlighted empirically in a simulation study using synthetic data focusing on model variable selection problems the paper provides a theoretical analysis and an extension of existing theory that looks sound the paper is well written and motivates in detail the perspective taken experiments the trace plot shows that singletry mh reaches the true state at around 20000 iterations whereas the mtm with n 10 reaches the true state at around 2000 iterations smaller by a factor of 10 it really looks as if the computation gain is achieved by only shifting computational cost from a longer chain to a shorter chain with more trials this does not seem too convincing in particular as parallelisation does not seem to be the main motivation for this work the experiment section is also lacking comparison with other methods such as rjmcmc for example to better situate the contribution i would have also liked to see some real data applications that is absent from this paper in my understanding the approach applies to discrete state spaces only and the experimental results lack use of real world data and comparison with other approaches docsepthe authors studied the mixing time of the mtm algorithm they propose a new class of weights or a generalization of the previous weights the analysis of the mixing 
time is important and the mtm scheme is a relevant algorithm moreover the paper is wellwritten in my opinion since it is a bit mathematical work some parts are not easy to follow also the numerical examples are hard to read some parts are not easy to read maybe less synthesis could help
### Summary: | based on the reviews and discussions we are happy to recommend acceptance please make sure that all comments in the discussion threads are taken into account in the final version of the manuscript
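A minimal sketch of one multiple-try Metropolis update on a discrete state space with a user-supplied weight function is given below. It assumes a symmetric proposal and a weight of the form h(pi(y)/pi(x)) for a balancing function h satisfying h(r) = r h(1/r), for example sqrt(r) or r/(1+r); the paper's exact weight class, mixing-time bound, and automatic choice of the number of trials are not reproduced.

```python
import numpy as np

def mtm_step(x, propose, weight, n_trials, rng):
    """One multiple-try Metropolis update on a discrete space.

    x        : current state (e.g. a binary variable-inclusion vector)
    propose  : function (rng, state) -> candidate, assumed symmetric
               (e.g. flip one randomly chosen coordinate)
    weight   : w(y, x) >= 0, e.g. a balancing function of pi(y)/pi(x)
               such as sqrt(pi(y) / pi(x))
    n_trials : number of candidates drawn per iteration
    rng      : numpy Generator, e.g. np.random.default_rng()
    """
    ys = [propose(rng, x) for _ in range(n_trials)]
    w_y = np.array([weight(y, x) for y in ys])
    if w_y.sum() <= 0:
        return x
    j = rng.choice(n_trials, p=w_y / w_y.sum())    # select a candidate by weight
    y = ys[j]
    # reference set: fresh draws from the selected candidate plus the current state
    ref = [propose(rng, y) for _ in range(n_trials - 1)] + [x]
    w_ref = np.array([weight(z, y) for z in ref])
    accept = min(1.0, w_y.sum() / w_ref.sum())     # generalised MH acceptance ratio
    return y if rng.random() < accept else x
```

In a Bayesian variable-selection setting like the one in the experiments, the state would be a binary inclusion vector, the proposal would flip one coordinate, and the n_trials weight evaluations per iteration are the part that can be parallelised.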
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a recoursenet which learns to instancedependently recourse to environments so that the learner gets better samples at training time and the predictor is well informed which instance to recourse with which environments towards this goal they propose a novel learning objective to optimize trigger and the parameters of predictor and recourse recommender due to its nondifferentiability they resort to simple heuristics to approximate the learning objective and develop an efficient learning framework the experimental results demonstrate the efficacy of recoursenet over simple baselines achieving superior recourse accuracy versus fraction of recourse the results also show that instancewise recourse is important as well strengths paper is generally well written the motivation is very clear and the problem setup is very interesting novel and practical which i think of as the biggest contribution of this paper the assumption that the underlying true process z is unavailable such that we need to instancewisely recommend alternative beta sounds very reasonable the learning objective 1 and 2 seems novel and looks interesting the experimental results demonstrate that the proposed recoursenet clearly outperforms the baselines although no confidence intervals are provided and those baselines seem rather weak the authors made a good effort in developing novel datasets which can be used by other papers weaknesses doubt about the original learning objective 1 and 2 i understand what 1 and 2 are meant to do but in my understanding they can have a trivial solution for instance with sufficient network capacity and enough training time the learner may simply find a trivial solution pixbeta0 forall xbeta and simply maximize log ftheta yx as much as it can i understand that during training and with limited computational capacity of ftheta sometimes picdotcdot will become 1 because bad xs will have high prediction errors but it is only in terms of training process not encoded by the learning objective itself in other words the learning objective itself cannot distinguish between bad local optima trivial solutions and good local optima meaningful solutions if those minima can achieve the same loss value so i guess we need more discussion about whether 1 and 2 are wellposed in the first place crude approximation of the learning objective i understand the difficulty of dealing with the original learning objective 1 and 2 and crude approximations are inevitable then what we need to empirically show is that the proposed approximations are indeed maximizing the original objective 1 and 2 in a stable manner i would appreciate if the authors could show this in the rebuttal phase typos spread in the text especially in the mathematical notations see questions and typos below limited baselines i understand that there may not exist suitable baselines to compare because the problem setup is novel but i felt they are too few as an example in figure 2 the authors could have added more baselines that train ftheta selectively in order to truly show the validity of the steepness of the red curves ie the different steepnesses may come from the superiority of ftheta honestly im not an expert in this area so i will have discussions about the possible baselines with other reviewers in figure 4 it would be more informative if the authors could actually show the different beta realized for each class either qualitatively or quantitatively instead of the actual recourse accuracies the experimental results do not provide 
confidence intervals so the authors cannot argue statistical significancy summary in summary i found that the problem setup is very meaningful interesting novel and practical but the overall quality of the paper is a bit low and more experiments seem to be needed if the authors can address my concerns i can increase my score the authors did not particularly mention the limitations of this work docsepthe authors propose using a recourse mechanism that allows a model to recommend to its human users how they could improve the predictions of the model by capturing another sample of the current instance with somewhat different view hyperparameters less zoom more brightness etc the model is made of three stages one for prediction of a target class one for predicting whether resampling could be useful and another recommending how the resampling should be made the method requires that the model has access to different views of various instances and a corresponding real value that describes somewhat the transform variation over those views the authors state that this could be useful in settings such as the medical setting they then evaluate their model on a variety of synthetic and non synthetic datasets and show that their method can improve performance successfully on those one of my main concerns is around the idea of recourseresampling the authors state that this could be useful in medical settings but is it really it can be expensive to get additional mrict etc images for the patient and the hospital and more than that for the model to work it would need to be trained on lots of variations of the same instances with certain imaging parameters annotated furthermore it would need to somehow be able to generalize on different humans and hospital machines and how they interpret the recommended changes for the resampling it simply seems too expensive for what appears to be highly uncertain and unverifiable at inthewild application predictive improvements originality the work attempts to do something novel and potentially useful i consider its novelty at acceptable levels quality the paper is written well and presented well however the experimental quality is lacking as the model is only evaluated on datasets that are very precise in how a particular instance is viewed in terms of the view parameters given in medical settings for example this would not be very possible with current datasets the authors also do not evaluate on a more real world dataset of the types that they mentioned their method could be useful in such as medical settings the authors also do not compare with randomly recoursedresampled instances to evaluate the effectiveness of their recommender model clarity the paper is overall quite clear and the methods clearly described the paper is also lacking optimizer settings data augmentations and regularization details for the models that were trained significance assuming the work could be reframed in a way that it requires a more realistic data source with more noisy imaging parameters the idea of recourse could be a useful one as it currently stands however it does not seem to offer real world significance but does offer fair research novelty and significance the technical limitations i stated earlier stand but in terms of negative societal impact i dont see any obvious problems other than potentially increased hospital spending from patients and the government alike if implemented poorly docsepthe paper proposes a mechanism termed recoursenet to identify and modify input instances 
so as to achieve a better classification outcome from a classifier these modifications are performed in the latent space of environmentsettings recoursenet consists of three components a classifier a recoursetrigger which selects instances that require modification and a recourserecommender that makes modifications a mechanism for training the three components is detailed conditions under which the recourserecommendation leads to improved accuracy are studied experiments are performed on shapenet speech commands dataset and a synthetic dataset strengths offering recourse in the form of alternative environments is interesting especially for images the proposed optimization problem is original and various tricks are proposed to solve it various ablations of recoursenet are studied and the design choices are justified through experimental results weaknesses it is not clear what the problem recoursenet actually solves is it trying to perform classification when trainingdata is noisy l21 during training arent such adverse examples if labeled correctly important for generalization it would be concerning if all training instances were from a fixed set of viewsenvironments finding alternative environments makes sense during test time because the classifier might not have seen such examples during training misuse of the concept of offering recourse in various places l5737 the problem is defined as finding an alternative environment setting so that the classifier is more likely to get a correct prediction in traditional recourse a fixed classifier is assumed and actionable changes are suggested to a user input instance that has been negatively classified i am not convinced that performing preclassification data modifications can be termed as recourse the usage of the concept of recourse needs justification lack of a suitable baseline in experiments the questions studied are for different settings of the proposed algorithm for instance section 41 is an ablation study that measures how important their iterative greedy proposal is for solving eq 5 it would be interesting to see how recoursenet performs when compared to other methods that solve the same problem because the underlying problem it solves is not clear to me it becomes difficult for me to recommend baselines at this time but i suppose baselines like invariant risk minimization or other methods that attempt to find domaininvariant representations could be baselines yes docsepsummary the paper tries to improve the prediction accuracy through a different perspective recourse on instance environment firstly it points out the fact that a classifiers performance is also dependent on the data quality that is a good classifier is also likely to misclassify an image taken in very bad environment setting to address this problem the author proposes the recoursenet to first identify whether the data is good for classification or needed to taken from another environment setting such as another angle if the image is not qualified for a good classification the recoursenet will recommend the user an appropriate environment setting then the recoursenet will trains a classifier to classify the image of good quality or the images generated by the user contribution this paper proposes a novel threelevel framework for the recourse mechanism without human involvement in the training strengths the paper provides a novel perspective for enhancing prediction accuracy instead of focusing on the classification model for improving robustness it focuses on the input 
data quality and introduces a recourse mechanism for improving the data quality and consequently improving the prediction accuracy this is promising in medical applications for example the algorithm can be deployed on some online diagnosis platform and provides hintstips to the user about how to take a clear image that can make the diagnosis more accurate the proposed method is intuitive and relatively clearly explained the experiment is thorough in terms of the ablation study namely how every component is beneficial weaknesses the experiments are limited to small datasets there is some inconsistency in terms of notation and writing for example the environment setting appears as b or mathcalb and in line 88 it is inconsistent with the algorithm when saying that humans do not participate in the prediction task but they only generate new instances under the recommended environments how these two propositions benefit seems to be unclear to me the papers limitations are not discussed by the author
### Summary: | the paper proposes a recourse approach that recommends how to improve performance on instances by modifying their environment the paper is well motivated and provides a novel approach that is empirically demonstrated to be useful though the empirical evaluation is limited reviewers agree that this paper addresses an important question that has more recently started to get attention and that the contribution is novel creative and significant the quality of the write up could be improved and i encourage the authors to do so for the camera ready version
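The trigger / recommender / classifier decomposition described in the reviews can be pictured with the following inference-loop sketch. Every name here (trigger, recommender, acquire) and the 0.5 decision threshold are hypothetical placeholders; the actual learning objective, training procedure, and the way a user or simulator realises a recommended environment setting are not captured.

```python
def predict_with_recourse(x, beta, classifier, trigger, recommender, acquire,
                          max_rounds=1):
    """Classify an instance, optionally asking for a re-captured sample first.

    x           : current observation (e.g. an image)
    beta        : environment setting under which x was captured
    classifier  : f(x) -> class probabilities
    trigger     : pi(x, beta) -> probability that recourse would help
    recommender : g(x, beta) -> suggested alternative environment setting
    acquire     : callback returning a new observation captured under the
                  recommended setting (a human or a simulator in practice)
    """
    for _ in range(max_rounds):
        if trigger(x, beta) < 0.5:       # hypothetical decision threshold
            break                        # current sample judged good enough
        beta = recommender(x, beta)      # ask for a different environment
        x = acquire(beta)                # obtain the re-captured instance
    return classifier(x)
```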
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
4227,
3020,
761,
9249,
310,
1774,
347,
973,
50276,
296,
3755,
20556,
50276,
20790,
310,
3839,
973,
3542,
50276,
783,
16038,
310,
1077,
2590,
285,
253,
1895,
9978,
310,
1077,
4722,
4460,
285,
8542,
534,
891,
1158,
273,
347,
253,
5962,
7680,
273,
436,
2929,
253,
9376,
326,
253,
6944,
2032,
1232,
1182,
310,
29356,
824,
326,
359,
878,
281,
4227,
88,
9299,
5583,
5795,
9840,
7835,
1077,
5272,
50276,
783,
4715,
8103,
337,
285,
374,
3133,
4460,
285,
4453,
4722,
50276,
783,
5661,
1543,
7568,
326,
253,
4081,
761,
2108,
257,
292,
4518,
41731,
13015,
253,
1666,
25379,
3738,
642,
7162,
11508,
403,
2530,
285,
1110,
1666,
25379,
1646,
2581,
5075,
50276,
783,
4477,
1160,
247,
1175,
3434,
275,
6684,
4460,
15302,
534,
476,
320,
908,
407,
643,
9380,
50275,
20881,
1255,
265,
50275,
69,
17290,
670,
253,
3236,
4715,
8103,
337,
285,
374,
891,
2096,
752,
337,
285,
374,
403,
5486,
281,
513,
533,
275,
619,
4685,
597,
476,
452,
247,
14916,
2900,
323,
4227,
342,
4209,
2990,
5350,
285,
2217,
3733,
673,
253,
458,
47612,
778,
3365,
1089,
247,
14916,
2900,
8066,
2461,
17,
323,
455,
1269,
2461,
285,
3365,
22950,
2412,
269,
3124,
340,
89,
347,
1199,
347,
352,
476,
891,
2096,
326,
1309,
3733,
285,
342,
3710,
15180,
5350,
273,
269,
3124,
4536,
14402,
5256,
3830,
588,
2489,
337,
984,
3076,
48361,
588,
452,
1029,
10554,
6332,
533,
352,
310,
760,
275,
2426,
273,
3733,
1232,
417,
16202,
407,
253,
4715,
8103,
3139,
275,
643,
3000,
253,
4715,
8103,
3139,
2550,
12129,
875,
3076,
1980,
5556,
66,
14916,
5482,
285,
1175,
1980,
5556,
66,
14282,
5482,
604,
1110,
46836,
476,
5115,
253,
1072,
2957,
1318,
594,
891,
5476,
359,
878,
625,
5955,
670,
1880,
337,
285,
374,
403,
973,
7334,
275,
253,
806,
1659,
50274,
7083,
2496,
11193,
273,
253,
4715,
8103,
891,
2096,
253,
10183,
273,
10620,
342,
253,
3236,
4715,
8103,
337,
285,
374,
285,
18934,
34754,
403,
19455,
840,
752,
359,
878,
281,
45190,
921,
310,
326,
253,
4081,
34754,
403,
6296,
46875,
253,
3236,
8103,
337,
285,
374,
275,
247,
6474,
5133,
891,
651,
11435,
604,
253,
4477,
812,
921,
436,
275,
253,
30080,
22559,
3408,
50275,
555,
993,
5195,
275,
253,
2505,
3340,
275,
253,
15965,
41818,
923,
3533,
285,
963,
993,
2708,
50275,
15870,
1666,
25379,
891,
2096,
326,
627,
778,
417,
2226,
7470,
1666,
25379,
281,
7277,
984,
253,
1895,
9978,
310,
4460,
533,
891,
3543,
597,
403,
1512,
1643,
347,
271,
1650,
275,
4677,
374,
253,
4477,
812,
452,
2879,
625,
1666,
25379,
326,
6194,
269,
3124,
21656,
275,
1340,
281,
7777,
921,
253,
13091,
273,
253,
16624,
1255,
273,
253,
2502,
9191,
26332,
253,
1027,
16624,
1255,
265,
778,
1705,
432,
253,
34385,
273,
269,
3124,
20509,
516,
417,
271,
6485,
275,
436,
2170,
594,
891,
588,
452,
11985,
670,
253,
1896,
1666,
25379,
342,
643,
30628,
50275,
249,
4677,
577,
352,
651,
320,
625,
27096,
604,
253,
4477,
812,
2686,
921,
253,
1027,
9840,
8156,
323,
1016,
966,
2057,
36143,
390,
36878,
3185,
273,
253,
4588,
761,
9249,
3933,
19103,
50275,
783,
5661,
1543,
513,
417,
2085,
7162,
11508,
594,
253,
4477,
2550,
9059,
7605,
1415,
4306,
50275,
8774,
275,
6010,
891,
1119,
326,
253,
1895,
9978,
310,
1077,
14282,
4722,
4460,
285,
8542,
533,
253,
4583,
3290,
273,
253,
2929,
310,
247,
2372,
1698,
285,
625,
4679,
1646,
281,
320,
3058,
604,
253,
4477,
476,
2953,
619,
7350,
891,
476,
2572,
619,
4868,
253,
4477,
858,
417,
3782,
3748,
253,
7364,
273,
436,
789,
5474,
339,
431,
248,
4477,
12661,
970,
247,
761,
9249,
5122,
326,
4483,
247,
1566,
281,
5583,
281,
697,
1966,
4212,
849,
597,
812,
3157,
253,
13650,
273,
253,
1566,
407,
26475,
1529,
3410,
273,
253,
1655,
4227,
342,
8489,
1027,
1859,
4373,
22041,
1679,
21282,
625,
20468,
3966,
253,
1566,
310,
1160,
273,
1264,
8661,
581,
323,
10554,
273,
247,
2303,
966,
581,
323,
21565,
1880,
501,
312,
4906,
812,
320,
4217,
285,
1529,
46705,
849,
253,
501,
312,
4906,
943,
320,
1160,
50275,
783,
1332,
4419,
326,
253,
1566,
556,
2289,
281,
1027,
6849,
273,
2710,
10872,
285,
247,
3969,
1524,
1318,
326,
8631,
8489,
253,
4979,
7629,
689,
1110,
6849,
50275,
783,
4477,
1375,
326,
436,
812,
320,
4217,
275,
7533,
824,
347,
253,
3739,
4758,
50275,
9328,
840,
7472,
616,
1566,
327,
247,
5235,
273,
13506,
285,
1327,
13506,
15302,
285,
921,
326,
616,
1332,
476,
3157,
3045,
8379,
327,
1110,
581,
273,
619,
2022,
7350,
310,
1475,
253,
2934,
273,
761,
9249,
373,
312,
4906,
253,
4477,
1375,
326,
436,
812,
320,
4217,
275,
3739,
7533,
533,
310,
352,
1663,
352,
476,
320,
8214,
281,
755,
3081,
278,
1467,
3966,
3888,
323,
253,
3110,
285,
253,
4675,
285,
625,
685,
326,
323,
253,
1566,
281,
789,
352,
651,
878,
281,
320,
10166,
327,
8783,
273,
10575,
273,
253,
1072,
10872,
342,
2176,
6979,
3602,
28267,
33810,
352,
651,
878,
281,
10380,
320,
2104,
281,
39970,
327,
1027,
7497,
285,
4675,
10679,
285,
849,
597,
4665,
253,
8521,
2544,
323,
253,
501,
312,
4906,
352,
3365,
3133,
1512,
8214,
323,
752,
4620,
281,
320,
4122,
8767,
285,
440,
332,
18397,
387,
540,
248,
32778,
2898,
15970,
11701,
50275,
19164,
414,
50276,
783,
789,
9437,
281,
513,
1633,
4460,
285,
7826,
4217,
891,
1908,
697,
38135,
387,
12207,
2308,
50275,
15177,
253,
2929,
310,
3542,
973,
285,
3559,
973,
2299,
253,
5661,
3290,
310,
14999,
347,
253,
1566,
310,
760,
6760,
327,
15302,
326,
403,
1077,
10799,
275,
849,
247,
1798,
4227,
310,
11575,
275,
2426,
273,
253,
1859,
3602,
1677,
275,
3739,
7533,
323,
1650,
436,
651,
417,
320,
1077,
1896,
342,
1655,
15302,
253,
4477,
671,
513,
417,
7472,
327,
247,
625,
1524,
1533,
10895,
273,
253,
3510,
326,
597,
5393,
616,
1332,
812,
320,
4217,
275,
824,
347,
3739,
7533,
50276,
783,
4477,
671,
513,
417,
7277,
342,
12421,
761,
2108,
264,
373,
312,
6216,
10872,
281,
7472,
253,
12510,
273,
616,
3818,
3109,
1566,
50276,
498,
15752,
50276,
783,
2929,
310,
4583,
3240,
2590,
285,
253,
3082,
4518,
2529,
50275,
783,
2929,
310,
671,
14999,
5556,
6081,
7533,
941,
35919,
569,
285,
37820,
4278,
323,
253,
3210,
326,
497,
10166,
50276,
9188,
40348,
50276,
37411,
253,
789,
812,
320,
16110,
3163,
275,
247,
1039,
326,
352,
4419,
247,
625,
15958,
941,
2603,
342,
625,
27620,
6979,
3602,
253,
2934,
273,
761,
9249,
812,
320,
247,
4217,
581,
347,
352,
4390,
9572,
2299,
352,
1057,
417,
1646,
281,
3959,
1524,
1533,
8453,
533,
1057,
3959,
4344,
2561,
38135,
285,
8453,
253,
7681,
7364,
891,
4767,
4321,
1462,
533,
275,
2426,
273,
4016,
38058,
3486,
891,
13414,
923,
667,
4755,
3237,
643,
685,
7826,
2559,
4675,
9100,
432,
1363,
285,
253,
2208,
19605,
604,
9009,
15225,
5474,
339,
431,
248,
2929,
29328,
247,
5122,
23776,
761,
2108,
257,
292,
50276,
936,
4271,
285,
10007,
3280,
10872,
594,
347,
281,
5115,
247,
1805,
9162,
6454,
432,
247,
30410,
841,
14586,
403,
2684,
275,
253,
21624,
2317,
273,
12620,
12950,
761,
2108,
257,
292,
8414,
273,
1264,
4295,
50276,
66,
30410,
247,
761,
454,
1178,
10389,
1063,
534,
34899,
10872,
326,
2430,
11237,
285,
247,
761,
9249,
250,
2823,
3109,
326,
2789,
14586,
50276,
66,
5122,
323,
3733,
253,
1264,
4295,
310,
7000,
2515,
762,
534,
253,
761,
9249,
250,
27167,
318,
5644,
281,
5520,
7200,
403,
5421,
4679,
403,
2684,
327,
439,
522,
257,
292,
6519,
13896,
10895,
285,
247,
13506,
10895,
50276,
296,
3755,
20556,
50275,
2727,
2158,
761,
9249,
275,
253,
830,
273,
5795,
12620,
310,
4722,
3340,
323,
3888,
50276,
783,
4081,
13757,
1895,
310,
3236,
285,
2710,
24866,
403,
4081,
281,
8415,
352,
50276,
2044,
784,
490,
77,
569,
273,
761,
2108,
257,
292,
403,
5421,
285,
253,
2216,
10165,
403,
17285,
949,
5661,
1543,
50276,
20881,
1255,
265,
50275,
262,
310,
417,
2590,
752,
253,
1895,
761,
2108,
257,
292,
2686,
35910,
310,
352,
2820,
281,
1347,
9162,
672,
3733,
2203,
310,
27620,
50275,
77,
1797,
1309,
3733,
403,
2649,
824,
10021,
6667,
604,
13130,
9113,
1774,
323,
26647,
352,
651,
320,
8664,
604,
512,
3733,
10872,
497,
432,
247,
4229,
873,
273,
6849,
257,
11986,
4560,
5795,
12620,
2789,
3282,
1309,
1071,
673,
984,
253,
30410,
1537,
417,
452,
2326,
824,
6667,
1309,
3733,
50274,
24418,
2327,
273,
253,
4473,
273,
9159,
761,
9249,
275,
2710,
5053,
298,
3011,
1787,
253,
1895,
310,
2931,
347,
4560,
271,
5795,
3126,
4758,
594,
326,
253,
30410,
310,
625,
2779,
281,
755,
247,
3451,
10554,
275,
5899,
761,
9249,
247,
4229,
50276,
2437,
5425,
310,
8025,
285,
49353,
2544,
403,
5125,
281,
247,
2608,
3280,
4227,
326,
556,
644,
18123,
10509,
891,
717,
417,
13762,
326,
9591,
638,
42070,
941,
14586,
476,
320,
23776,
347,
761,
9249,
253,
10393,
273,
253,
4473,
273,
761,
9249,
3198,
22861,
50275,
77,
471,
273,
247,
7470,
8245,
275,
4679,
253,
3533,
5421,
403,
323,
1027,
7533,
273,
253,
4081,
5933,
323,
4227,
2593,
7609,
310,
271,
28913,
1263,
326,
5593,
849,
1774,
616,
34560,
38754,
10419,
310,
323,
16161,
16186,
608,
352,
651,
320,
4722,
281,
923,
849,
761,
2108,
257,
292,
17923,
672,
2429,
281,
643,
3082,
326,
8415,
253,
1072,
1895,
50276,
12157,
253,
6944,
1895,
352,
35910,
310,
417,
2590,
281,
479,
352,
4916,
2834,
323,
479,
281,
5583,
1666,
25379,
387,
436,
673,
533,
891,
9428,
1666,
25379,
751,
13727,
2495,
41458,
390,
643,
3082,
326,
3177,
281,
1089,
5028,
25168,
14237,
812,
320,
1666,
25379,
50276,
9820,
5474,
339,
793,
360,
3454,
253,
2929,
14177,
281,
3157,
253,
10554,
7200,
949,
247,
1027,
8668,
761,
9249,
327,
4227,
3126,
41005,
352,
2792,
562,
253,
958,
326,
247,
49996,
3045,
310,
671,
7976,
327,
253,
941,
3290,
326,
310,
247,
1175,
30410,
310,
671,
2779,
281,
3731,
2437,
1419,
271,
2460,
2668,
275,
1077,
3076,
3126,
4758,
281,
2953,
436,
1895,
253,
2488,
29328,
253,
761,
2108,
257,
292,
281,
806,
4271,
1880,
253,
941,
310,
1175,
323,
9162,
390,
3058,
281,
2668,
432,
1529,
3126,
4758,
824,
347,
1529,
6907,
604,
253,
2460,
310,
417,
12165,
323,
247,
1175,
9162,
253,
761,
2108,
257,
292,
588,
5583,
253,
2608,
271,
4569,
3126,
4758,
840,
253,
761,
2108,
257,
292,
588,
18784,
247,
30410,
281,
30215,
253,
2460,
273,
1175,
3290,
390,
253,
3888,
4561,
407,
253,
2608,
50275,
1987,
2382,
436,
2929,
29328,
247,
4460,
1264,
5251,
7792,
323,
253,
761,
9249,
5122,
1293,
1966,
10171,
275,
253,
3733,
50276,
296,
3755,
20556,
50275,
783,
2929,
3400,
247,
4460,
8668,
323,
22474,
10554,
7200,
3185,
273,
13654,
327,
253,
9162,
1566,
323,
11138,
31640,
352,
16633,
327,
253,
3280,
941,
3290,
285,
23970,
247,
761,
9249,
5122,
323,
11138,
253,
941,
3290,
285,
17912,
11138,
253,
10554,
7200,
436,
310,
12532,
275,
3739,
4893,
323,
1650,
253,
5933,
476,
320,
18329,
327,
690,
3909,
6120,
5147,
285,
3400,
12662,
296,
2824,
281,
253,
2608,
670,
849,
281,
1379,
247,
2590,
2460,
326,
476,
1056,
253,
6120,
625,
7899,
50276,
783,
4081,
1332,
310,
27350,
285,
4942,
4518,
5544,
50276,
783,
3368,
310,
11080,
275,
2426,
273,
253,
28913,
1263,
10775,
849,
1046,
4445,
310,
12912,
50276,
20881,
1255,
265,
50276,
783,
4679,
403,
3710,
281,
1355,
15302,
50275,
9088,
310,
690,
43430,
275,
2426,
273,
14951,
285,
4028,
323,
1650,
253,
3126,
4758,
4620,
347,
270,
390,
14168,
1179,
67,
285,
275,
1386,
11003,
352,
310,
16706,
342,
253,
5933,
672,
3981,
326,
7497,
513,
417,
10078,
275,
253,
10554,
4836,
533,
597,
760,
6635,
747,
10872,
762,
253,
8521,
12620,
50275,
5430,
841,
767,
39325,
5649,
3133,
281,
320,
12744,
281,
479,
50276,
783,
9380,
7364,
403,
417,
5469,
407,
253,
2488,
2490,
187,
4118,
18435,
27,
783,
2929,
29328,
247,
761,
9249,
2746,
326,
32636,
849,
281,
3157,
3045,
327,
10872,
407,
26264,
616,
3126,
253,
2929,
310,
973,
17194,
285,
3400,
247,
4460,
2746,
326,
310,
45190,
5183,
281,
320,
4217,
2167,
253,
16774,
7103,
310,
3710,
30628,
5194,
326,
436,
2929,
12453,
271,
1774,
1953,
326,
556,
625,
4102,
3053,
281,
755,
4116,
285,
326,
253,
7680,
310,
4460,
10995,
285,
1534,
253,
3290,
273,
253,
3630,
598,
812,
320,
5520,
285,
891,
11907,
253,
4477,
281,
513,
594,
323,
253,
6568,
4704,
2715,
209
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper considers the application of the nonlinear markov chain monte carlo mcmc method to problems in bayesian inference the key idea of the method is instead of using a single kernel a mixture of two kernels a linear and nonlinear one are used two types of nonlinear kernels are considered the boltzmanngibbs interaction and the acceptreject interaction it is shown that the performance of the proposed methods is at least as good as when the nonlinear kernel is not used in addition to the linear one in general combining different mcmc methods to achieve better exploration of a target density is good practice eg in the case of a multimodal density one can combine local moves that explore within a mode with moves that explore between different modes the rationale is that neither of these moves by themselves would provide good exploration of the target density local moves will take a long time to travel between different modes whereas global moves will typically have trouble exploring a mode locally this paper aligns with the general idea of the benefit of combining together different mcmc kernels the general idea of the paper is sound however a drawback is that it is at times hard to understand what exactly is happening and to link various portions of the paper together as far as was able to tell the idea at the core of the paper is as described above to combine mcmc moves with different average jump sizes and to show this leads to better exploration for example the description of an interacting particle system is hard to link to the concrete mcmc schemes ie based on the boltzmanngibbs and acceptreject interaction proposed in this paper yes docsep this paper builds on the nonlinear mcmc methods proposed in 1 and provides convergence guarantees of these nonlinear sampling techniques in terms of the number of iterations and the number of samples required using these general convergence guarantees the paper shows that this holds for two specific algorithms boltzmanngibbs interaction and acceptreject interaction the paper also reports some positive and negative results on 2d toy tasks and on cifar10 on 2d tasks the paper shows that if the invariant density eta can incorporate further details about the problem then nonlinear methods work really well however on cifar10 they perform as well as linear methods but not better 1 christophe andrieu ajay jasra arnaud doucet and pierre del moral on nonlinear markov chain monte carlo bernoulli 173987 1014 2011 doi 10315010bej307 url httpsdoiorg10315010bej307 1 originality the paper builds heavily on the initial mathematical machinery proposed in 1 however they derive novel convergence bounds for these nonlinear mcmc samplers in quite general settings they also show that these bounds are satisfied for two common nonlinear samplers the paper also uses really interesting results from the propagation of chaos to show that even though the central limit theorem breaks down in a nonlinear setting due to statistical dependences across particles there are still arguments that can be made about the independence of these interactions as n rightarrow infty i found the motivations and mathematical proofs quite novel 2 quality for someone who is a layperson to nonlinear mcmc methods i found the paper had very high pedagogical quality and i was able to understand to a large extent the breadth of the literature on this field i found the mathematical proofs correct in my understanding though i did not look closely at the proofs in the appendix i followed through the 
intuition and justifications that lead to theorem 1 and i think it is an elegant result that gives us useful convergence bounds for nonlinear mcmc methods 3 clarity i found the paper easy to follow in terms of its writing if not always in the math some of this is due to a lack of expertise in nonlinear mcmc techniques however i believe the authors did a good job providing background where necessary and shifted a large part of the verbose mathematics to the appendices 4 significance i believe this paper has quite a significant convergence bound for nonlinear mcmc also i believe the empirical negative result on the cifar10 problem shows an important limitation of nonlinear mcmc methods currently i think this result showing that nonlinear methods perform as well as linear methods but dont necessarily outperform them is an important factor to consider before using these methods for large scale bayesian machine learning i believe the authors should refer to the additional computational requirements required by their method in terms of wall clock time as well if possible they might have this information in an appendix or i might have missed it in which case please point me to it they have addressed social impacts docsepthe work explores the existing technique of nonlinear mcmc where the transition kernel draws information from across a collection of samples it establishes some theoretical results and proposes two such kernels boltzmanngibbs interaction and acceptreject interaction with empirical demonstrations of both this work reflects substantial effort but a good amount of the material reads as a review and im unsure that the theoretical results really connect with the proposed kernels to make for a coherent contribution at this stage this does look to be a useful line of inquiry and i do not wish to discourage the authors but the work as it stands does not yet seem ready for publication in a sense it seems like two works the first establishing the theoretical results the second proposing the new kernels within the nonlinear mcmc framework and showing they can work although not that they can outperform on the first im not well placed to determine whether the theoretical results alone stand as a sufficient contribution or display original insight and if they do they might be published in isolation on the second i am not sure that there is sufficient novelty here the kernels are not greatly different from existing work from what i can tell and the empirical assessment of them is quite limited showing that they work but not that they outperform in predictive performance say or exhibit some other desirable quality such as accurate uncertainty quantification as motivated in the introduction overall i think that either the method and application would need to be developed further or the work should focus on the theoretical results but it is difficult to recommend in its current form no concerns docsepthis work proposes a non linear mcmc method based on an interacting particle system the article studies its longtime and largeparticle nonasymptotic behaviour ultimately showing the method consistency under standard hypotheses the method is then illustrated on toy and realworld examples while the paper is strongly related to 1 it the way the empirical nonlinearity is formed and subsequent analysis differs from 1 in the following way it is defined as a selfinteracting process in 1 and as an interacting particle system in this work strengths this is clearly a strong methodological paper with good 
theoretical backing for the consistency of the method the method is creative and substantially different from previous works the presentation is rather clear but can be improved upon as hinted in the questions section weaknesses the differences with 1 are fairly subtle and may not be noticed at first read in particular because no background on 1 is provided this paper is rather long for a conference format and most of the story and interesting proof points lie in the appendix the experiments lack a comparison with 1 which is the main related work rather than mala minor comments and typos 1 the references are not formatted properly for instance lacking proper capitalisation of names hamiltonian monte carlo bayes etc or sometimes not properly referenced how good is the bayes posterior in deep neural networks really is published at icml uniform longtime at journal of statistical physics 2 intractible intractable 3 has an added presents an additional 4 compatability compatibility 5 critera criteria 6 distribution flow is not linked to equation 8 7 this is inequality this inequality line 222 8 the weighted supremum norm term used in g2 is not defined explicitly in d12 9 of auxiliary density eta is required the auxiliary 10 dyanmics dynamics 11 stepszie stepsize 12 otherewise otherwise 13 account of our experimental settings missing full stop 14 which is depends which depends 15 capitalisation of names in the references is sometimes off langevin gibbs etc 16 i can see that some cited papers are given as arxiv references but are actually published at least 32 40 60 please check some limitations of the work are a bit too quickly brushed over in particular the fact that the lln of this article is across the particles and not on the trajectory of the markov chains is a bit too subtle see questions
### Summary:
the contribution of this submission is strong an analysis of convergence of nonlinear mcmc methods most reviewers agree that the submission is theoretically interesting and of interest to the neurips community thus i recommend acceptance however i note and share with some reviewers the following concern the empirical results on cifar10 do not show the benefit of the proposed method the result of the linear baseline ula is very far from sota for resnet and cifar10 yet ula outperforms the nonlinear version in terms of accuracy and time efficiency it would be great if there is a realistic modeldataset between the simple 2d toy experiment and perhaps too ambitious resnetcifar10 that can be included to show the benefits if any of nonlinear mcmc some reviewers also raised a concern about the organisationflow of the paper which i hope the authors will fix in the cameraready version
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper focuses on detecting individual unfairness in supervised learning the main contributions are 1 a method to generate adversarial examples that is the examples that are very close to the original input but get a very different outcome an optimization problem is used formulated by leveraging the dro framework the authors then point out the difficulty in solving the problem specially with continuous features and propose an ode based solution to find the adversarial examples 2 after finding the adversarial examples a hypothesis testing framework is proposed to test the model for individual unfairness the main idea here to compute the mean and variance of the test statistic on a given set of data points and then construct the confidence intervals using the normality assumption on a high level the idea of testing for individual unfairness is an interesting one specially given that there arent many metrics of it individual unfairness out there however it feels like many important design choices are not very clear for these reasons this reviewer is split between a weak accept and a weak reject see the detailed comments below 1 intuitively it seems like the need for hypothesis testing arises when one is working with a small test set if the test set is large enough then assuming iid samples one could already be quite confident of the point estimate of the amount of individual unfairness as measured in eq 32 however the paper then assumes that the distance metric is learnt from the test set now if the test set is already quite small how good a metric do we expect to learn it is not clear how to reconcile these two problems 2 how much interpretability does the hypothesis test really add the test statistic does really provide a whole lot more interpretability on on top of the quantity measured in eq 32 which itself it very closely related to that in eq 24 essentially the main insight seems to be to monitor the change in loss from an example to an adversarial version of it so that additional benefit does the hypothesis testing bring here specially when working with reasonably sized test sets also see the point about the need for hypothesis testing above given that the paper does not offer any discussion into the tradeoff between type i and ii errors it is not clear that advantages does the hypothesis testing bring for us 3 perhaps it would be worth discussing how the scale of the loss 0 might make the ratio in eq 32 unstable in practice 4 this reviewer is a bit confused about the normality result derived in theorem 31 true that the distribution here tends to a normal when n to infty however with small test sets how well does this assumption hold if it does not the interval constructed in eq 36 may not be very accurate in general an empirical analysis of the intervals as in httpsarxivorgpdf200705124pdf would be a great addition to the paper 5 the paper seems to take the assumption that the distance metric specified by the user is differentiable for solving the problem in section 2 is that true it is quite possible for domain experts to specify distance metrics with discontinuities in them eg if education level of x1 is higher than education level of x2 upweigh the distance by 1 can such userspecified metrics be handled by the methods in the paper 6 this reviewer is not very sure about the usage of fourfifth rule for the hypothesis test while i am not a legal expert the fourfifth rule seems to apply to groups instead of individuals as suggested by the paper moreover applying the fourfifth rule on 
wellunderstood and wellbounded quantities acceptance rate of the two groups as is done in the group fairness literature indeed makes sense however applying the same ratio threshold on a quantity such as loss that can be arbitrarily high or low might not be very interpretable also see point 3 for instance if the original loss is very low 0 or very high in the order of 10s does it make sense to apply the ratio test similarly the test might lead to very different behavior when the loss changes from say hinge loss to squared loss to logistic loss is this behavior indeed desirable some explanation here would be greatly helpful in convincing the readers of the usefulness of the ratio test postrebuttal comments thanks for the detailed answers many of my concerns were addressed and i am increasing my score as a result a followup thought it would be nice to add some discussion on the runtime of the proposed frameworkdocsepthe paper introduces a framework to statistically test whether a given model is individually fair or not in particular given a model a distance metric over individuals and a data point z the authors propose an algorithm that finds a new data point z such that z is similar to z but their corresponding losses are different under the model if the model is not individually fair they provide experimental results to show how their proposed method can detect unfairness in practice i think the paper tackles an interesting problem and has nice results both theoretically and experimentally my major concern about this paper is a fundamental one what do you exactly mean by individual fairness i think there should be a formal definition for the fairness notion you have in mind according to the individual fairness notion of dwork et al even one couple of similar examples on which the given model performs differently constitutes unfairness but it looks like the fairness notion in this paper requires that on average over the input data distribution the model is treating similar individuals similarly which is differentweaker than the notion proposed by dwork et al please clarify if im missing something or else formally define the fairness notion you used in this work other comments which are mostly about the technical development early on in the paper that i find hard to follow i dont quite understand eq 22 and how it is solving an individually fair learning problem isnt that just the maximum expected loss on distributions that are epsilon far from the empirical distribution if so how does this help with fairness should wppn be defined somewhere also how is the dual problem obtained in eq 23 the authors say it is known but i think this requires more explanationderivation in eq 24 the function elllambdac is defined as getting fxi a label as input but looking at the right hand side of the equation this function actually depends on xi itself not fxi how is the gradient flow attack related to the dual problem in eq 23 and 24 what happened to epsilon in this continuous formulation it looks like that the primary objective now is to solve eq 24 and not the actual dual problem in 23 right if thats the case then what happened to primal and dual problems how should one pick the stopping time t in eq 26 and how that affects the proposed method for finding the unfair map shouldnt there be a theoretical statement about xt or the unfair map overall i found section 2 of the paper very confusing i will raise my score if the issues raised above are addresseddocsepthis paper proposes a test to determine whether an 
ml model violates individual fairness the main contribution beyond existing work is that this method allows for continuous feature spaces conceptually this paper rests on the gradient flow attack which produces a mapping that given an example produces another example which violates the inviddual fairness constraint thus given a distribution one can compute the change in loss between the original distribution and the mapped distribution the ratio of these quantities is what the authors use for their hypothesis testing problem is it below a the limit of tolerance or not for the most part the paper is farily wellwritten though it gets a little more difficult to understand towards the end the basic premise is interesting using a gradient flow attack to discover pairs of elements that are similar but have different predicted outcomes i dont fully understand the motivation behind the loss ratio statistic why compare ellfphix y y and ellfx y instead of simply fphix y and fx does this become very sensitive when ellfx y is close to 0 if so then it seems as though worse models might more easily pass the test since the denominator of the loss ratio would generally be larger i think the gradient flow attack is promising here but im not convinced that this is the right test to run im not completely convinced by the claim that this test is interpretable while the authors point to the 45 rule as a measure of impact the loss ratio proposed here is clearly measuring something quite different from what a disparate impact test would traditionally measure i dont see this as making the test results interpretable
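Editor's note: the loss-ratio hypothesis test debated in the reviews above (their eq 32 and eq 36) can be pictured with a small sketch. This is not the paper's implementation; the per-example ratio, the tolerance value, and the normal-approximation interval below are illustrative stand-ins for what the reviewers describe, and `eps` exists only to acknowledge the near-zero-denominator concern raised in comment 3.

```python
import numpy as np
from scipy.stats import norm

def loss_ratio_test(orig_loss, adv_loss, tolerance=1.25, alpha=0.05, eps=1e-8):
    """Flag individual unfairness if the mean loss ratio is significantly above `tolerance`.

    orig_loss, adv_loss: per-example losses on the test points and on their
    perturbed (adversarial) counterparts, two 1-d arrays of equal length.
    `tolerance` plays the role of the four-fifths-style limit the reviews question.
    """
    ratios = np.asarray(adv_loss) / (np.asarray(orig_loss) + eps)
    n = len(ratios)
    mean = ratios.mean()
    se = ratios.std(ddof=1) / np.sqrt(n)
    z = norm.ppf(1 - alpha)            # one-sided normal quantile
    lower = mean - z * se              # CLT-based lower confidence bound on the mean ratio
    return {"mean_ratio": mean, "lower_bound": lower, "reject_fairness": lower > tolerance}
```

With the small test sets the reviewers worry about, the normal interval above can be poorly calibrated, which is exactly the concern in comment 4.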
### Summary: | this paper studies how to statistically test if a given model violates the constraint of individual fairness this is an interesting and novel problem and the paper leverages the technique of gradient flow to identify a witness pair for individual fairness violation during the rebuttal the authors have addressed many concerns raised in the reviews the author should also consider discussing the runtime and improving the exposition to resolve some of the presentation issues raised in the reviews | [
input_ids: token id sequence for this row's Input/Output text, omitted for readability ] | [ attention_mask: a run of 1s of the same length, omitted ] | [ labels: token id sequence for the same text, omitted ]
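Editor's note: the gradient-flow attack that the previous row's reviews keep returning to (their eq 24-26, the "unfair map", and the stopping time t) can be sketched as a crude Euler discretization. Everything here is a hypothetical reading of the reviews rather than the authors' code: the penalty weight, step size, number of steps, and the assumption that the fair metric is differentiable (the very assumption reviewer point 5 questions).

```python
import torch

def unfair_map(model, loss_fn, fair_dist, x, y, lam=1.0, step=1e-2, n_steps=200):
    """Euler steps of dx'/dt = grad_x' [ loss(f(x'), y) - lam * d(x, x') ].

    `fair_dist` must be differentiable in its second argument; discontinuous
    user-specified metrics would break this sketch.
    """
    x_adv = x.clone().detach().requires_grad_(True)
    for _ in range(n_steps):                      # n_steps stands in for the stopping time t
        obj = loss_fn(model(x_adv), y) - lam * fair_dist(x, x_adv)
        grad, = torch.autograd.grad(obj, x_adv)
        with torch.no_grad():
            x_adv += step * grad                  # ascend the penalized loss
    return x_adv.detach()
```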
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper is well written and investigates dialogue summarization that has not received much attention it proposes a new model called cordial which can generate a summary draft followed by a final summary it achieves comparable or better results in term of both automatic evaluation metrics eg compress ratio rouge score and human evaluations consistent and informative about the quality of generated summaries in different settings however there are still several disadvantages of this paper 1 it can generate a summary draft but its quality is almost not presented in the paper except the ablation study of table 1 even the results within table 1 are still about the quality of the final summary the paper slightly overclaims its usefulness in the draft summary generation more results or analysis about draft summary should be presented 2 the human evaluation has only 30 examples and the scale is too small also does the score is 1 0 1 or other scale do you use majority vote or mean plus standard deviation to get results in the table why gold in table 3 is so low in consistent 3 the method looks applicable to the generable summarization task more results of this are also interesting although these drawbacks its quality is good overall docsepthis paper proposes cordial a new method for dialog summarization cordial firstly constructs a coarse draft by generating intent and key phrases for every dialog turn and splits the dialog into chunks by inserting special boundary tokens then the segmented original dialog and the constructed draft are feed as input to generate the final summary cordial employs bartxsum as its backbone model which is a pretrained language model finetuned on xsum summary dataset experiment result on samsum dataset shows cordial achieves sota performance under both automatic evaluation metric and human evaluations overall the paper presents an interesting practical recipe for dialog summarization it requires few additional annotation besides the summary and the human evaluation results looks promising however the method proposed here somewhat lacks in novelty and some part of the paper is not clearly written thus i give this paper a weak reject rating comments 1 in 22 how are the intents annotated is it a purely automatic process based on keywords matching or the keywords are merely cues for human annotators 2 in 23 the algorithm for finding the cutting points is an incremental one it cant account for the similarity between last chunk and the last sentence in the summary since the cutting point of second to last chunk already determines the boundary of the last chunk 3 how are the output generated exactly the last sentence in 24 states each sentence is generated separately does it mean each output sentence have different input how is it different from standard autoregressive tokenbytoken generation 4 24 should also explicitly refer to figure 1 for clarity 5 cordial uses the bartxsum as initialization which is trained on xsum dataset are other baselines also gone through the same xsum training 6 what is the model size of all the models in the experiments it would be better to have some descriptions on model architectures in the experiment section docsepthe paper proposes cordial for abstractive dialogue summarization cordial extends bart by generating an intermediate summary draft which provides weaklysupervised signals and controling the length of the final summary results show significant improvements over competitive summarization models such as pegasus and bart in multiple 
different metrics some comments 1 the paper emphasizes that dialogue summarization is challenging due to its multispeaker standpoints casual language and limited data although the use of the proposed summary draft would help solve the first challenge it is hard to see any correlation between the other problems mentioned and the proposed solutions in the paper this is especially the case for the controlling of the summary length why is this useful specifically for dialogue summarization 2 the summary draft is one kind of a content plan which is widely used in text generation including text summarization 1 the technique of extracting key phrase is similar to how content selection is done in 2 please compare the proposed solution to other kinds of content planning 3 to extract key phrases the method identifies the longest common subsequence lcs parameterized by a threshold however how this threshold is set and used is not discussed in the paper this is important information in order to understand how these key phrases would look like for example in figure 1 how is s just one of many boring days at work extracted when the lcs is only at work for turn 2 1 httpswwwaclweborganthologyc181101pdf 2 httpsarxivorgpdf180810792pdfdocsep summary this paper addresses the problem of abstractive dialogue summarization its key idea is to label an interrogative pronoun category and extract key phrases from each dialogue turn as weak guide for dialogue summarization it also proposes a lengthcontrollable generation method for final summary the proposed approach is evaluated on the samsum as one of the largest abstractive dialogue summarization benchmarks on which it shows competitive performance over recent models strengths 1 it proposes a twostep coarsetofine approach for abstractive dialogue summarization it first extracts category labels and key phrases from each dialogue turn and then generates final summaries by controlling granularity this idea itself could be novel 2 it shows strong performance over other recent methods on the recently released samsum dataset 3 it tests the proposed approach with four recent pretrained language models including dialogpt unilm pegasus and bart weakness 1 this paper proposes a novel coarsetofine approach for abstractive dialogue summarization but its implementation is largely adhoc and engineering intensive and thus bears little technical novelty 1 the coarse part aims at generating drafts using interrogative pronoun category prediction and key phrase extraction these two are largely based on existing techniques eg ratner et al 2019 and kitaev klien 2018 and some heuristics eg thresholding for key phrases detection 2 the fine part aims at generating target summary with controllability of granularity its implementation is also based on a series of engineering heuristics eg dialogue splitting by rouge score binary classification for cutpoint detection 3 in summary it is hard to find methodological novelty in the proposed method given that iclr is a top premier ml venue it could be a significant weakness to be a publishable work 2 experimental results are rather weak 1 although samsum dataset may be one of the best benchmarks for the target task experiments on only a single dataset is limited to show the generality and effectiveness of the proposed method given that the proposed method is adhoc i suspect much additional endeavor may be required to apply to another dataset 2 i am not sure whether the comparison in table 1 is fair enough since the proposed approach relies on 
the additional components for draft construction it could require more other types of training data or learned modules that other method may not need this should be clarified in the draft conclusion my initial decision is reject mainly due to lack of technical novelty limited experiments could be another issue to be improved
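Editor's note: the LCS-based extraction step that the second reviewer above asks about can be illustrated with a toy sketch. The actual method extracts key phrases rather than whole turns, so the turn-level scoring, whitespace tokenization, and the 0.5 threshold below are placeholders for the hyperparameter the review says is never specified.

```python
def lcs_length(a, b):
    # classic dynamic program over two token lists, O(len(a) * len(b))
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, ta in enumerate(a, 1):
        for j, tb in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if ta == tb else max(dp[i - 1][j], dp[i][j - 1])
    return dp[-1][-1]

def salient_turns(dialogue_turns, summary, threshold=0.5):
    """Keep turns whose token overlap with the reference summary is high enough."""
    summary_tokens = summary.lower().split()
    kept = []
    for turn in dialogue_turns:
        tokens = turn.lower().split()
        overlap = lcs_length(tokens, summary_tokens) / max(len(tokens), 1)
        if overlap >= threshold:   # the unspecified threshold the review points out
            kept.append(turn)
    return kept
```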
### Summary: | the paper proposes a method for the interesting task of dialog summarisation which is slowly getting attention from the research community in particular they propose a method which first generates a summary draft and then a final draft pros 1 the paper is well written 2 addresses an interesting problem 3 sota results cons 1 lack of novelty 2 no quantitative analysis of the summary draft though it is as an important part of the proposed solution 3 human evaluations are not adequate the authors have said they will expand on this but clear details are not provided 4 the bart model seems to have some advantage as it is pretrained on xsum data whereas some of the other models are not the authors havent clarified this sufficiently in the rebuttal overall the reviewers were not completely happy with the work and there was not clear champion | [
input_ids: token id sequence for this row's Input/Output text, omitted for readability ] | [ attention_mask: a run of 1s of the same length, omitted ] | [ labels: token id sequence for the same text, omitted (the excerpt ends mid-array)
18435,
27,
783,
2929,
29328,
247,
1332,
323,
253,
4722,
4836,
273,
10756,
10405,
5837,
534,
310,
7808,
2970,
4116,
432,
253,
2561,
3114,
275,
1798,
597,
12661,
247,
1332,
534,
806,
15693,
247,
6010,
7482,
285,
840,
247,
2457,
7482,
50276,
856,
84,
337,
253,
2929,
310,
973,
3542,
374,
12453,
271,
4722,
1895,
495,
256,
5503,
1543,
50276,
5040,
337,
3480,
273,
38135,
374,
642,
11745,
1783,
273,
253,
6010,
7482,
2167,
352,
310,
347,
271,
1774,
629,
273,
253,
4081,
2900,
495,
1966,
27163,
403,
417,
10599,
253,
4477,
452,
753,
597,
588,
5645,
327,
436,
533,
2590,
4278,
403,
417,
2530,
577,
253,
44693,
1566,
3133,
281,
452,
690,
5750,
347,
352,
310,
3215,
11273,
327,
1269,
2204,
941,
5727,
690,
273,
253,
643,
3210,
403,
417,
253,
4477,
419,
2254,
31637,
436,
10481,
275,
253,
30080,
22559,
50276,
1189,
455,
253,
30628,
497,
417,
4336,
5211,
342,
253,
789,
285,
627,
369,
417,
2590,
16928,
209
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposed a new method for blind image denoising using a gan2gan network different from previous work noise2noise the proposed method only needs single noisy images to train the network without noisy pairs given a noisy dataset a gan generator is trained using the real noisy but smooth patches with the noise generator a pair of generated noise samples are added on the noisy image content to train a denoiser this is trained iteratively so that the image content is better and better after cleaned by the trained denoiser the final results on synthetic and real data showed improvements over the baselines

overall this paper is very interesting to read it solves a harder problem than noise2noise when the pairs are not available the idea of refining the noisy image iteratively is reasonable which makes the image content for training progressively better the results showed significant improvements over its competitor noise2void there are still some areas that it could improve
1 it is unclear the purpose of the cycle loss although it improves the psnr the motivation is not clear
2 the idea of generating synthetic noise for image denoising has been studied already this should be discussed abdelhamed abdelrahman marcus a brubaker and michael s brown noise flow noise modeling with conditional normalizing flows proceedings of the ieee international conference on computer vision 2019
3 it would be great to have natural images in the experiments to validate the idea in particular showing that the proposed method can reduce noise and reveal more image content and texture abdelhamed abdelrahman stephen lin and michael s brown a highquality denoising dataset for smartphone cameras proceedings of the ieee conference on computer vision and pattern recognition 2018 plotz tobias and stefan roth benchmarking denoising algorithms with real photographs proceedings of the ieee conference on computer vision and pattern recognition 2017
4 it is good to have a table or flowchart about the iterative training to make the paper easier to read

this paper addresses a challenging task of blind image denoising where a single noisy image is provided with the assumption that the noise is zero mean additive and independent from the original image content this is mostly the realworld scenario different from the recent n2n training the authors propose a gan2gan based method since this blind setting cannot be trained by n2n n2n or deterministic training needs explicit or implicit knowledge of the clean image in order to be trained whereas the gan2gan method does not leading to more realistic and efficient training this method first attempts to simulate noise given the noisy image generate rough and noisy estimates of the clean image and iteratively train a denoiser with synthetic noisy pairs from the generator for blind denoising this work produces impressive results for synthetic and real world blind denoising this work provides a sound approach to denoising given only the noisy image the experiments were done with extensive datasets and baselines although the paper is solid i am not sure how this work compares to well known algorithms for blind image denoising residual dense network for image restoration highquality selfsupervised deep image denoising cycleisp real image restoration via improved data synthesis real image denoising with feature attention just to name a few if the authors can provide comparisons or discussion analysis on these algorithms it would be helpful

the proposed method uses the generative learning method to simulate the noise and synthesize noisy image pairs to train the proposed network experimental results show the effectiveness of the proposed method learning a denoising network from noisy images has been developed by krull et al 2019 batson royer 2019 laine et al 2019 the main difference is the use of generative learning one possible clarification is whether the aforementioned methods using generative learning can generate comparable results or not if so the contribution of the paper is limited the motivation of using the proposed network design is not clear in addition using more generators will lead to larger capacity models than existing methods it is not clear whether the performance gains are due to the use of such larger capacity models or not for the real noise images why do the authors evaluate the proposed method on the microscopy and medical images how about the results on the realworld natural noisy images in addition the proposed results still contain significant noise residual as shown in figure 4

this paper proposes a framework to train a network to remove noises which are zeromean additive and independent of the clean image with only noisy images and without knowing the noise statistics they mathematically prove that a network which is trained from pairs of images generated by adding simulated noises into the noisy image can remove noises from the input noisy image the proposed framework can remove the noises from the input noisy image then it adds simulated noises into the denoised image to generate a pair of images and train the next network to further remove noises and it does this process iteratively the proposed framework follows gcbd and utilizes flat textureless regions to train a network to simulate noises and proposes a wavelet based method to effectively distinguish flat regions from the ones that contain highfrequency repeating patterns the experimental results show that the proposed method has good performance under simulated gaussian noises as well as the wt and ct datasets i have several concerns and suggestions for this paper
1 one assumption of the proposed method is that the noises and signal are independent which is not true for most of the scenarios the raw images captured by the camera contain both poisson noises i am wondering if the proposed method can deal with poisson noises or not also the final displayed images processed by the isp contain signal dependent noises with spatial correlation can the proposed framework remove this kind of noises
2 in sec 33 why does the framework add simulated noises into the network ground truth zj1i will it be better if we directly treat xphij1zi in eq 10 as the ground truth
3 as claimed above theorem 1 fnoisy n2n z y gives a better estimate of x than x for a sufficiently large σ^2 what will the results be if σ^2 is not large enough that is to say can the proposed framework remove small noises from the input images
4 similarly according to theorem 1 the iterative process in sec 33 will converge if y0y1 can the network still effectively remove noises if yy0 which means the residual noises are not significant in the images
5 gtheta1 generates noises with a given random vector r how does gtheta3 not need a random vector to add simulated noises into gtheta2z
6 more analyses are needed in sec 44 for the ablation study especially why is sigmoid so important some descriptions are not clear and confusing
7 i cannot understand the relationship between n2c with eq 4
8 what is unifs s in sec 42
9 it is not clear whether the described dataset for table 3 is for training or testing
10 for a fair comparison do the two n2c networks have the same network structure
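to make the zero mean additive noise argument referenced in the reviews above concrete, here is the generic noise2noise identity such training relies on; this is a standard restatement under the stated assumptions, not a quote of the paper's theorem 1:

```latex
% assume y' = x + n' with E[n' | y] = 0 and n' independent of f(y); then
\mathbb{E}\,\lVert f(y) - y' \rVert^2
  \;=\; \mathbb{E}\,\lVert f(y) - x \rVert^2 \;+\; \mathbb{E}\,\lVert n' \rVert^2 .
```

since the second term does not depend on f, minimizing the loss against a second noisy target y' has the same minimizer as minimizing against the clean image x, which is why synthesized noisy pairs can stand in for clean/noisy pairs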
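a minimal sketch of the noisy-pair training loop the reviews describe, assuming a learned noise generator and a denoiser network; the module names, latent size, and loss below are hypothetical illustrations, not the paper's actual code:

```python
import torch
import torch.nn.functional as F

def noisy_pair_training_step(denoiser, noise_gen, noisy_batch, optimizer, latent_dim=64):
    """One gan2gan-style update: add two independently generated noise samples to the
    same (already noisy) images and train the denoiser from one copy to the other."""
    b = noisy_batch.size(0)
    n1 = noise_gen(torch.randn(b, latent_dim))   # simulated noise, assumed image-shaped
    n2 = noise_gen(torch.randn(b, latent_dim))   # a second, independent draw
    src = noisy_batch + n1                       # synthetic noisy input
    tgt = noisy_batch + n2                       # independent noisy target (noise2noise role)
    loss = F.mse_loss(denoiser(src), tgt)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

in the iterative variant the reviews mention, the denoised output of one round would replace noisy_batch before the next round is trained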
### Summary: | summary of discussion three reviewers rated the paper good 7 while reviewer2 disagreed r2s criticism was focussed on how this work is placed within existingrelated literature and no technical problem was identified the authors have addressed some of r2s commentsconcerns r2 has not participated in the discussion novelty and contributions overall the reviews seem consistent with an incremental paper which is technically valid improves the state of the art on a reasonably difficult task however it does not appear from the reviews that the paper substantially advances our understanding of machine learning more broadly beyond this specific application experiments there is some disagreement among reviewers on the adequacy of the experiments with at least two reviewers calling for experiments involving natural photos i believe the authors responses adequately address these concerns they pointed out that the key selling point of their paper is the ability to model structured noise which is less relevant in natural photos on the balance of things i think this paper should be accepted but i wouldnt argue if it did not make the cut due to its narrow scope for this reason i recommended poster presentation | [
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
1 the explanation of the proposed confidence u is significantly deficient further statements on the calculation process theory and function are expected please provide more details
2 what is the point of the operation that the output of the global reference is multiplied by similarity s and confidence u directly as shown in figure 5 will the direct multiplication by the confidence u impair the function of the attention s a more reasonable and efficient method is expected for applying the confidence map
3 different from conventional entropy models this paper proposed to progressively use the prior from the context model reference model and hyperprior model instead of obtaining priors respectively and then combining them to obtain estimation results but the paper lacks the explanation and experiments on the rationality of the design of the progressive process the comparison results of whether progressive or not are missing
4 generally the performance gain is very limited as shown in the rd curves in figure 6 in particular why does the performance gain appear to be so minor in the low bitrate range
5 in figure 10 why are the comparison results with any learned compression methods missing the comparisons with various methods only on the kodak dataset in fig 6 are not convincing
6 the visual comparison results on the clic dataset of high resolution are missing and some key references for deepnetwork based image compression are also missing

summary of paper the paper presents a learningbased approach for image compression to reduce the compression rate it describes two novel extensions one to take the global context into account and an improved version of the commonly used gdn layer their advantage has been shown in a thorough ablation study overall the method achieves superior performance compared to standard codecs as well as other stateoftheart learningbased methods on the evaluated dataset from kodak
strengths
s1 the approach is clearly described and the figures help to follow the paper
s2 the proposed approach of using a reference model to consider the global context is novel for the application of image compression
s3 suggestion to improve the gdn layer to address the mean shift problem of the existing formulation
s4 the ablation study shows the effect of the different modules
s5 the method achieves superior results on the tested dataset kodak standard in image compression and both psnr and ssim have been reported
weaknesses
w1 runtime for encoding and decoding not listed in case that the global reference model leads to some overhead in runtime especially for decoding time that would be worth mentioning also are there any limits in terms of image size the method can handle
justification interesting approach with superior results

this paper proposes two methods to improve deep image compression performance i a global reference module and ii a meanshifting gdn module gsdn the global reference module searches over the decoded latents to find the latents relevant to the target latent to improve the accuracy of the entropy estimate the authors extended the yang et al 2020 method to using masked patches gsdn extends gdn to use a subtractive operation
pros the proposal seems to give better ratedistortion results than lee 2019 and minnen 2018 figure 6 and 7
cons the method for generating u 2d feature maps is not clearly described it is unclear how the output channel of the parameter network 768 is calculated in the form of 2d feature maps in table 1 the output channel of the encoder is 384 while the corresponding input channel of the decoder is 192 i couldnt understand why the number of channels are not the same in figure 5 the meaning of s log is unclear and does not appear to have been mentioned in the text the proposed method uses the mix quantization approach minnen singh 2020 but the evaluation of figure 7 is compared to minnen 2018 as context hyperprior so it is not a fair comparison for example putting the results of the minnen 2018 approach with mix quantization on it and comparing would make the claim of the proposal effect credible according to figure 7 gsdn appears to be effective it is interesting but i thought the effects were less explained it would be more convincing if direct data on the meanshifting problem were presented

the authors introduce global reference into the entropy model for deep image compression they also develop a reference algorithm to ensemble local context global reference and hyperprior this causes the algorithm to be robust to background noise also the authors develop the gsdn module to handle the meanshifting issue the proposed method demonstrates good quality and memory usage gain this paper proposes to take into account the global information as well as the local information to perform better image compression the authors also demonstrate comparison to popular image compression standards and recent deep learning approaches i think this work is a nice work however i have two main concerns
the dataset used for evaluation is rather outdated have the authors tried evaluating on recent image compression datasets or custom data and compared with the state of the art
have the authors compared computational complexity the main reason why industry standards are not enthusiastic about deep learning approaches to compression is the computational complexity not so much memory have the authors compared flops moreover since this work is dealing with global image information it seems the complexity would increase rapidly with image size while standard jpeg will relatively be not as severe have the authors experimented with computational time at uhd qhd or 4k
i am leaning towards accept but not by a lot i would like the authors to discuss empirical results on more recent datasets computational complexity in terms of image size flops and time with high resolution like uhd to 4k after these comments i would like to adjust the rating
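the gdn and gsdn discussion above can be made concrete with a rough pytorch-style layer that adds a learned subtractive mean term to generalized divisive normalization; this is an illustrative approximation of the meanshifting idea, not the paper's exact gsdn definition:

```python
import torch
import torch.nn as nn

class MeanShiftedGDN(nn.Module):
    """GDN with an extra learned mean shift; illustrative only."""
    def __init__(self, channels, eps=1e-6):
        super().__init__()
        self.beta = nn.Parameter(torch.ones(channels))         # per-channel offset in the pool
        self.gamma = nn.Parameter(0.1 * torch.eye(channels))   # cross-channel pooling weights
        self.mu = nn.Parameter(torch.zeros(channels))          # learned subtractive mean term
        self.eps = eps

    def forward(self, x):                                      # x: (n, c, h, w)
        x_c = x - self.mu.view(1, -1, 1, 1)
        pool = torch.einsum('ij,njhw->nihw', self.gamma, x_c ** 2)
        return x_c / torch.sqrt(self.beta.view(1, -1, 1, 1) + pool + self.eps)
```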
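the question about multiplying the global reference output by the similarity s and the confidence u can be read against a toy version of that gating shown below; the shapes and scoring are assumptions for illustration, not the paper's implementation:

```python
import torch
import torch.nn.functional as F

def gated_global_reference(target, decoded, confidence):
    """target: (c,) latent to be coded; decoded: (m, c) previously decoded latents;
    confidence: (m,) values in [0, 1]; returns an aggregated reference feature (c,)."""
    scores = decoded @ target / decoded.size(1) ** 0.5   # dot-product similarity
    s = F.softmax(scores, dim=0)                          # similarity weights s
    w = s * confidence                                    # direct multiplication by confidence u
    return (w.unsqueeze(1) * decoded).sum(dim=0)
```

whether this direct product is the best way to use u is exactly what the review asks; an alternative would be to fold the confidence into the scores before the softmax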
### Summary: | this paper received moderately good reviews 3 positives 6 6 7 and 1 negative 5 the reviewers are generally positive about the main idea but identified several limitations performance improvement is marginal compared to existing approaches the proposed method incurs higher computational complexity and the presentation is not clear enough some of these issues are addressed in the rebuttal though overall the merits of this work outweigh the drawbacks and i recommend accepting this paper | [
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper develops an autoencoder to solve the inverse problem of converting incoming visual stimuli into electrical stimulation patterns such that the responses evoked by the electrical stimulation mimics the target visual stimulus by using an accurate decoding model perceptually accurate loss function and training across multiple participants the encoder outperforms existing approach for secondsight retinal prosthesis strengths 1 well written 2 advances stateoftheart in the artificial retina application and shows how their approach is better using overall convincing analysis weakness 1 a lot of innovation particular architecture of encoderdecoder relies on specifics of crude retinal prosthesis i am not sure how general purpose the approach presented here is does the model of phosphenes translate to other neural stimulation systems like somatosensory spinal cord stimulation 2 the approach only works for static images as visual targets how does the autoencoder framework extend to dynamics targets 3 the decoder description on page 5 is short might help to improve it for broader applicability limitations are described adequately docsepthe manuscript presents a encoder to generate optimal stimulation parameters for a visual prosthesis specifically the focus of the paper is on developing an endtoend deep learning model that is trained to invert a known fixedforward model strength 1 the idea is quite interesting on using hna for mapping input images to stimulation parameters 2 it can increase the accuracy of retinal prosthesis and in general other sensory feedback systems weakness there are several questions that i would invite the authors to address 1 the results and metrics are based on a forward model designed in 19 if the forward model changes would you need to update the encoder 2 if the forward model was not fixed how would the hna perform from a real retinal implant perspective due to plasticity in visual cortex would the stimulation parameters learnt using a fixed decoder change 3 after figure 1c can there be a figure 1d to show how would this be deployed since there will not real time perceptual loss in case of patients maybe some qualitative metrics but a quantitative loss 4 figure 3 can the authors add how the stimulation parameters for electrodes looks like for a given mnist image say for 0 input what is the output of the gride of an electrode while using hna surrogate and nave 5 what is computational burden of these encoder since they are for implantable applications this would be an important metric latency for such computing these models definitely cannot use nvidia gpu for the implanted application how would this scale minor issues line 100 is not formatted correctly check weakness docsepthe authors propose to use an autoencoder trained to invert a known feedforward model that approximates the biological network underlying the early human visual system they demonstrate the efficacy in application to visual prosthetics and show that there method leads electrode activation patterns that produce better and more intelligible decodings downstream in the biological system than comparable methods they claim their system to be generalizable to any type of sensory neuroprostheses with obvious remodeling and retraining required they use a nice approach of treating the pseudo inverse of a known biological function as the encoder of an encoder decoder pair and learn the encoder given patient specific parameters the model of the biology and treatment of the problem of phosphenes is very 
elegant and works well it does suggest that generalizability of the methodology requires equally detailed biological forward models in other neuroprosthetic applications the results presented especially in figure 3 are very impressive but there should be more clear caveats that this methodology requires a patient specific model of phosphenes and different patients have quite different observed phosphenes even with the reduction of patient to patient variability on mnist introduced by this method i would not anticipate as much success reducing variability on the already harder task of coco like images it should also be noted that all of the evaluation here is done on a model of individual patients and at no time are patients actually asked to observe these stimulation patterns and decode them downstream the authors are upfront about the potential limitations and societal impacts of the work
### Summary: | this paper formulates the problem of learning how to stimulate a visual neuroprosthesis as a hybrid autoencoder while the decoder can be taken as a known and fixed model that describes how stimuli produce percepts the encoder needs to be learned once learned the encoder maps target percepts into stimuli that can be passed into the device decoder motivation and formulation of the problem is especially clear strong the paper is well written and the reviewers and i appreciated the nice solution strategy for a potentially impactful application area there were some concerns about how generally applicable the approach is however the results presented likely do advance the state of the art in this setting given my own reading of the paper and the consistently positive reviewer scores im very comfortable endorsing this paper for acceptance
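The reviews above describe training an encoder end-to-end so that, when its output stimulation pattern is pushed through a known and fixed forward (phosphene) model, the simulated percept matches the target image. As a rough illustration only — not the authors' implementation — the sketch below shows that loop in PyTorch; the phosphene model, encoder sizes, and the plain MSE loss are all hypothetical stand-ins (the reviewed work uses a detailed biological forward model and a perceptually weighted loss).

```python
import torch
import torch.nn as nn

N_ELECTRODES, IMG_DIM = 60, 28 * 28  # assumed sizes, for illustration only

# Placeholder for the known, fixed forward model: stimulation -> predicted percept.
phosphene_model = nn.Sequential(nn.Linear(N_ELECTRODES, IMG_DIM), nn.Sigmoid())
for p in phosphene_model.parameters():
    p.requires_grad_(False)  # the "decoder" half is known a priori and stays frozen

# Learned encoder: target percept (image) -> electrode stimulation parameters.
encoder = nn.Sequential(
    nn.Flatten(),
    nn.Linear(IMG_DIM, 256), nn.ReLU(),
    nn.Linear(256, N_ELECTRODES), nn.Sigmoid(),  # amplitudes scaled to [0, 1]
)

optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()  # stand-in for a perceptually accurate loss

def train_step(images: torch.Tensor) -> float:
    """images: (batch, 1, 28, 28) targets that the evoked percept should match."""
    stim = encoder(images)                      # candidate stimulation pattern
    percept = phosphene_model(stim)             # simulated patient percept
    loss = loss_fn(percept, images.flatten(1))  # gradients reach only the encoder
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the forward model is frozen, only the encoder is updated; swapping in a patient-specific phosphene model or a different loss changes the modules but not the structure of the loop.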
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper studies the nonconvex outerobjective and strongly convex innerobjective bilevel optimization problem through the lens of bregmen distance the paper covers the deterministic optimizaton and stochastic optimization in both situation the authors provide the algorithm and its convergence analysis the theoretical results shows the proposed algorithm with the aid of bregman distance improve the performance with respect to the condition number kappa and utilizing the variance reduction technique could further improve the dependency on epsilon with a order of frac12 the numerical experiment further prove the efficiency of the proposed algorithm the strengths of the paper is obvious that it proposes the algorithms that could improve the theoretical upperbound of previous stateoftheart results the presentation is clear and the numerical experiment is sound the weaknesses mainly because the improvement is predictable it is wellknown that the variancereduction technique could improve the oder wrt the accuracy epsilon and the bregman disctance helps with the condition number kappa the numerical verifications of the problem is limited for each algorithm we only have one experiment the experiment shows the proposed algortihms have lower losses it would be better if the authors provide more experiments i would like increase my score if the author could provide more experimental results during rebuttal docsepthis paper incorporate the bregman distance into bilevel optimization and propose three methods biobred sbiobredd asbiobred which targets at addressing deterministic and stochastic bilevel problem such proposed algorithms have matched best target accuracy epsilon and improved the condition number kappa compared with other benchmarks meanwhile such analysis is adaptable for nonsmooth outer function the experiments also demonstrate the superior performance of proposed algorithms in terms of strengths the proposed work shows the condition number improvement in terms of convergence analysis meanwhile in different experimental settings the proposed algorithms have demonstrated its superior performance individually both assumptions and convergence analysis are standard and easy to follow in terms of weakness several lemmas lemma2 lemma4 in main body lack explanations the theoretical analysis is very standard while it will be better to point out the technical innovations non negative societal impact docsepthis paper proposes a new class of bilevel optimization bo problems both in deterministic and stochastic forms compared to the classic bo problem their outer function has an additional nonsmooth term which makes their model more general eg it contains the case when we use l1 regularization then three algorithms are proposed the first two are used to solve deterministicstochastic bo problems in the new form respectively and the last one is an accelerated version of the second algorithm strengths 1 this paper broadens the class of bo problems that appeared in previous literature which leads to the demand for new algorithms because previous algorithms can not deal with nonsmooth outer function 2 under similar assumptions compared with related works the algorithms proposed in this paper achieve the best convergence rate with known condition number weaknesses 1 this paper only mentions one circumstance where we need to consider a nonsmooth outer function when we use l1 regularization that may narrow the unique field of application of this work ie this work can do but the previous works can not it is 
good to mention more examplesapplications of using nonsmooth objectives the efficiency of proposed algorithms depends on the choice of psi which is used to define bregman distance in the experiment part it is chosen such that the updating of x is very easy ie a closedform solution exists when hx 0 but when other psi is chosen or hx neq 0 how to efficiently solve the problem to update x the complexity of solving this subproblem seems does not appear in the comparison with other algorithms maybe the authors can make it more clear how to solve this subproblem for general hx
### Summary: | the paper studies bilevel optimization problems provides three algorithms for different settings and improves the convergence analysis in terms of the condition number in addition numerical experiments are conducted that provide illustration of the effectiveness of the algorithms three reviewers all agree that the paper should be published as it contributes to the literature and will be of interest to the neurips audience when preparing the final version of the manuscript please incorporate the discussion that addressed the reviewers comments either in the main text or the appendix
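The bilevel-optimization review turns on how the Bregman distance D_psi is chosen and notes that the x-update has a closed form for suitable psi when h(x) = 0. Purely as an illustration of that kind of subproblem — not the reviewed paper's algorithm — here is a single mirror-descent (Bregman proximal) step with the negative-entropy psi on the probability simplex, where the closed form is the usual multiplicative update; the step size and toy gradient are arbitrary placeholders.

```python
import numpy as np

def mirror_descent_step(x: np.ndarray, grad: np.ndarray, eta: float) -> np.ndarray:
    """One step of  x+ = argmin_u <grad, u> + (1/eta) * D_psi(u, x)
    over the simplex with psi(u) = sum_i u_i log u_i, which reduces to the
    closed-form multiplicative update below (no inner solver needed)."""
    z = x * np.exp(-eta * grad)
    return z / z.sum()

x = np.ones(5) / 5                          # start at the uniform distribution
g = np.array([0.1, -0.2, 0.3, 0.0, -0.1])   # toy gradient
for _ in range(10):
    x = mirror_descent_step(x, g, eta=0.5)
print(x)  # mass shifts toward coordinates with the most negative gradient
```

With a nonzero nonsmooth term h(x) (e.g. an l1 regularizer) the same step generally requires a small inner proximal solve, which is exactly the extra cost the reviewer asks to see accounted for in the comparisons.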
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper introduces the labour inspection checklists dataset consists of 63634 instances where each instance is an inspection conducted by the norwegian labour inspection authority 575 features are provided for each organization that is inspected and potential target variables are checklists list of items that should be inspected for the organization and noncompliance the authors experiment with several machine learning methods on their task but the performance for all of them is very poor i appreciate the authors response and i have updated my score however i still believe that the extremely poor performance of ml models on this dataset is a red flag the dataset in the current format is probably not informative enough which can even discourage researchers from working on this area as the trained models dont generalize i would encourage the authors to include other sources of data if possible to mitigate this issue 1 the authors are releasing a dataset on a relatively unexplored task however it could be helpful for better inspection by agencies 2 the dataset consists of 63634 inspections which is to some extent large 3 several experiments are conducted among two tasks predicting checklists and noncompliance with different machine learning models 1 the whole data is only coming from a single authority nlia therefore it might not be directly useful for other parts of the world 2 the performance of the trained machine learning models is extremely poor that probably could be because the data is not informative enough for the defined task poor performance sometimes means that the data is not informative enough which i think is the case here 3 the authors do not elaborate on features that are provided for organizations and why they believe they are informative enough also for the checklist i think there is no single ground truth for it the fact that a set of checklists are selected during the inspection by the inspector does not mean that the chosen set is optimal making predicting the list even harder docsepthe paper introduces a new labor inspection checklists dataset licd for selecting relevant checklists to address working environment violations in organizations which is very important for protecting labor rights licd consists of 63634 instances with past inspections and contains 575 features and 2 possible target variables noncompliance and checklists based on the target variables the dataset could potentially be used to select a relevant checklist to 1 survey a given target organization as well as to 2 classify whether noncompliance is found at an inspection for a given organization and checklist it could be used by inspectors to select highrisk organizations for inspections the authors used feature selection in the experiments with anova f 2 coefficient importance from model training mutual information sequential selection forward search and recursive elimination the ml methods used for the experiments were decision tree dt logistic regression lr naive bayes classifier nbc knearestneighbor knn adaboost gradient boost and multilayered perceptron mlp gridsearchcv for hyperparameter tuning for knn adaboost gradientboost and mlp each ml method was evaluated on 8 different feature set sizes that are selected via feature selection there are only minor differences between most of the methods in the test sequential selection anova f and mutual information have the bestrecorded accuracy scores future work using this dataset could for example involve a multiobjective optimization problem to 
select the most relevant inspection checklist that maximizes the number of violations found in a given organization this new and unique dataset is addressing a very important labor safety issue promoting a safe working environment for all workers and motivating further research on this subject the authors conducted initial features evaluation and basic data visualization to highlight how some of the key features of the dataset are distributed the high number of relevant checklists used within a single industry code points to the fact that there is significant diversity in the health environmental and safety risks for organizations even within the same industry good job on being specific about the equipment used a dell precision 5560 laptop with intel i9 11950h at 5ghz 64gb ram at 3200mhz nvidia quadro rtx a2000 and windows 10 are used for the experiments the full dataset was imbalanced in terms of noncompliance to address this issue the authors created a balanced set for ml models several questions were not fully addressed in the manuscript how was the data balancing created which samples were selected were there any work done for the analysis of the similarities between the users and industries do the industry code numbers also maintain the similarity information between the industries are industries that have numbers that are closer also perform more similar business rather than distant codes what could be the reason for the spike gaps between codes in figure 2a for example between 50 and 60 which correspond to most of the building and road construction industries could it be better to treat them as categorical features rather than numeric sequential selection was unable to complete within two hours for any of the larger feature sets would running the selection longer help the results for the trained models on the unbalanced licd have room for better performance and highlight the necessity for running the experiment on a balanced dataset or more complex models however the results on the balanced data improved also leave room for improvement details about the gridsearchcv hyperparameter tuning were not specified in the text thus better results could potentially be achieved using a broader or better hyperparameters tuning mechanism in future work the variance for the crossvalidation runs is not shown in the manuscript could you please add it docsepthis work seeks to offer a new dataset on over 60000 workplace inspections as well as completed checklists related to labor law compliance in norway between 2012 and the beginning over 2019 this dataset is intended to help identify patterns of health safety and environmental violations using ml techniques as well as help determine which are the appropriate checklists to use in a given context the authors note that the sample experiments that they show using this dataset unfortunately do not exhibit a high performance with respect to predicting violations but are optimistic that novel methods using this data may be able to outperform their benchmark experiments the manuscript illustrates that one contribution of this dataset it expands on previous sdg related neurips datasets and benchmarks such as sustainbench since none of those datasets cover sdg 8 this is a notoriously difficult arena to get good data on especially financial information of regulated entities and i laud their efforts to make this information public this dataset is focused in on one country only this is still an advance as i appreciate this data is difficult to obtain and 
regulations vary from place to place that will characterize what is in violation or not differently however one major difference between this and for example sustainbench that they mention in related work is that the latter seeks to at least keep track of more than one country while they contextualize this dataset within other related work its not clear to me that all of the citations refer to datasets some appear to refer to manuscripts that illustrate analysis others to datasets that may be more or less challenging to parse in an effort for harm prevention the dataset that is publicly available strips out information such as geography violation type or facility name that may actually prove useful for predictions eg in the case of spatial spillovers perhaps theres a way the authors can indicate a trusted analyst position for others to be vetted to use the full dataset so as to incorporate such data a la census sworn status its also surprising to see that 74 of inspections yield violations and that makes me wonder if theres degrees of variation in compliance with inspections that one may want to take into account in subsequent analyses if sufficient information is available andor if there are ways to subdividefocus on a subset of the population the linked dataset is also in danish which limits accessibility there is an option in the upper right to convert to english but then it no longer remains on the same page an english language link if available would enhance accessibility else a note indicating the webpage is not in english would also be helpful docsepthe authors release a labour inspection checklists dataset that contains 63k instances of inspections conduced by norwegian labour inspection authority nlia during the period of january 2013 to june 2019 the dataset contains 575 features and 2 target variables the features contain meta data on organizational or financial information about the company under inspection the target variables are 1 checklist inspection topic 2 is company compliant for all questions or non compliant for atleast one questions in the checklist the dataset is based on actual daily operations of nlia without any transformations to show how the authors envision this dataset as a useful dataset for ml community the results from baseline experiments for prediction of target variables are discussed the contributions from authors include curation of a dataset that can be used to make the inspections of hse violations easier for governmental agencies that conduct labour inspections ml can help such agencies with limited resources to better plan the inspections if the ml research on this dataset can show practically useful results then it can be very useful for agencies that perform an important job of making sure that the employees work in safe working conditions ml based recommendations can help inspection agencies in shortlisting high risk companies to inspect on priority and in quickly creating the right checklist of questions to evaluate during inspections the authors promise to share dataset publicly after the paper review process at this point it is hard to comment on the accessibility of the dataset for wider research community as per authors the dataset is compliant to gdpr and norwegian privacy protection laws it is also made sure that the data does not contain privacy violating features by getting dataset reviewed by the management of inspection agency the paper as several weaknesses as listed below 1 evaluation metrics the two category of experiments used to 
demonstrate the application of ml for predicting target variables are evaluated using metrics that dwell on hard labels ie accuracy precision and recall the models used in experiments return soft scores and in order to convert soft scores to hard labels 01 or 0n if multi class a threshold is chosen as pivot point the authors did not make it clear how threshold was chosen and in fact the pivot point is unknown seems like default threshold of 05 was chosen but 05 does not make sense for non calibrated models like gradientboost it is unknown what is the desirable precision or desirable recall that can be used for tuning the threshold instead for comparisons of different models it would be better to first evaluate the models in terms of metrics that can use soft scores like auc micromacro and accordingly show better performing models 2 missing feature engineering details missing values getting imputed by 0 limit ability for researcher to use their own imputation method a researcher might want to impute with mean or median or a 1 depending on the range of values in the data was there any data transformation done on any features before passing to the models how many features were categorical and how were they treated such details on feature engineering are missing 3 feature selection much discussion is provided on feature selection but there is no result based on full set of features in most of the chosen models the use of regularization can inherently handle 575 features and remove ones that are less useful as a result feature selection step in many cases can turn out to be of minimal benefit also if author believe feature selection to be an important step then a discussion of top features impacting the target outcome is missing 4 lack of error bars the authors mention use of cross validation but error bars are not provided in results if cross validation was not used then details about train validation and test splits is missing it would have been better to add details on how the sizes of train validation and test sets if used 5 lack of discussion on application of results the performance for both tasks using different models is low authors have not mentioned what is the target performance that could be of a practical help for agency is the current performance for both tasks practically useful if yes how does the agency plan to use the predictions moreover clsp is a multi class classification task and it is unclear which class is more predictable in the 106 accuracy achieved by dt model 6 dataset not scalable to other countries limiting its impact the dataset is curated by norwegian inspection agency and comprises of inspections performed in norway this limits the impact and usefulness of dataset to broader research community from other geographies 7 imbalanced vs balanced 2674 class ratio is generally not required to be underover sampled underover sampling is generally done for highly imbalanced datasets eg 1 in 1000 instances moreover test sets are kept in original class ratio when underover sampling is performed to provide real evaluation metrics that will mimic production scenario instead if the class ratio is not 5050 use of right evaluation metrics like auc instead of accuracy comes into play 8 data distribution analysis it would be useful to provide distribution analysis of different attributes present in the data how many attributes were categorical how many numeric what is the relationship of different variables with target variable etc docsepthe paper proposes a dataset of 64k points 
for two tasks selecting the best checklist for inspection and predicting noncompliance the problem the paper attacks is an interesting one which has real practical applications and could help workers while the dataset seems sound some choices in the design of experimentation makes it so that the proposed baseline is weaker than it needs be namely for the checklist selection problem it is not a 369 way classification problem as not every checklist can be applied to every industry which reduces the number of classes the experiments used are simple statistical methods which dont seem to have been selected with any particular reasoning behind them given the promises of dl in pretty much ever area it feels like a severe limitation to not have at least a bert or some pretrained algorithm as a baseline the feature selection method is done using a shortcut where methods are evaluated only on dt and eliminated based on this result given the short running time of most of the algorithms it would have been better to simply eliminate sequential selection and run every selection method on every algorithm theres also no report of the result without feature selection the 5cross validation seems like a good way to have results that people cant reproduce the dataset is large enough to be split into a standard traindevtest which would allow practitioners to compare on the same test set in the same vein it is not clear how you finetune your algorithms given the absence of the test set the methodology makes me believe that there is possibly peeking involved in the early stage such as the feature selection it would be beneficial to extend on the differences between the previously released dataset by the same authors in 13 14 and this one one of the highlighted contribution is that this is a new dataset but it seems like its the same data with different preprocessing which would make it new tasks on the data while this doesnt change what is accomplished it would bring more transparency to the paper
### Summary: | the paper proposes a dataset on labor inspections where a set of curated features are used to predict inspection checklists and violations the perspective is unique and fresh to the machine learning community and the underlying goal of safe working environment is relevant to everyone the reviewers mostly agree with this viewpoint despite some pointed out the current results are not competent or some detailsanalysis are missing this leaves room for future research and the main contribution is the hardtoobtain dataset and the new problem the author also provided detailed rebuttals with a revised draft addressing many review comments in my opinion the datasetproblem contribution outweighs potential flaws in experiments and i recommend the paper to be accepted
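One of the reviews above argues that threshold-based accuracy/precision/recall (with an implicit 0.5 cutoff) is a poor way to compare the non-compliance classifiers, and that ranking metrics such as ROC AUC computed on the models' soft scores — with the test split kept at the natural ~26/74 class ratio — would be more informative. A minimal scikit-learn sketch of that style of evaluation is given below; the synthetic data is a placeholder since the LICD feature layout is not reproduced here, and the two models are just examples.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the ~26/74 compliant / non-compliant labels;
# the real LICD features and targets would be loaded here instead.
X, y = make_classification(n_samples=5000, n_features=50,
                           weights=[0.26, 0.74], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("gradient boosting", GradientBoostingClassifier())]:
    model.fit(X_tr, y_tr)
    scores = model.predict_proba(X_te)[:, 1]  # soft scores, no fixed 0.5 threshold
    print(f"{name}: ROC AUC = {roc_auc_score(y_te, scores):.3f}")
```

An operating threshold can then be chosen afterwards on a validation split to meet whatever precision or recall the inspection agency actually needs, rather than defaulting to 0.5.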
[ input_ids | attention_mask | labels for this row: three aligned token-id arrays of roughly two thousand entries each, omitted here as numeric dataset-viewer residue ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents one quantize for all framework the framework claims to search for the network and the quantization without the need for retraining results are promising although it is not clear to me if the comparisons are fair different bit size would also be great to see the resulting architecture and how it is different from other nas approaches it is not clear to me why the number of flops is so low compared to related methods in the experiments i missed the computational cost for training and how it compares to the other approaches as this paper suggests there is no need for retraining or finetuning docsepthis paper proposed a method to train quantized supernets which can be directly deployed without retraining the motivation is to have a supernet with a given quantization bitwidth which only train once and can be deployed with different architectures under different flops budget this paper made a bunch of experiments showing that the proposed once quantized for all method can find dnn architectures which have sota performance with low bitwidth the paper also shows that when training lowerbits supernet it is helpful to use the weights from the trained higherbits supernet pros 1 this paper targets a very practical problem that quantization is actually required for most resourceconstrained devices recently supernets eg ofa bignas validate that it is possible to directly obtain dnns with different flops budget from a single big supernet without retraining this saves lots of training time in the cases that one want to have dnns with different flops combining quantizationaware training and supernet is an effective approach to save training time when we want search different dnns with different sizeflops which are quantized with certain bitwidth 2 the authors also show that it is very important to use the pretrained weights from a trained higherbits supernet as initialization when training the quantized supernet this observation is also meaningful for the cases that lowbits quantizationaware training is hard or unstable 3 using the proposed oncequantizedforall method the authors get several quantized dnns which have sota results on imagenet the authors compared both sota architectures quantization aware training and recent quantizationaware nas method the accuracy flops of the searched dnns is better than the compared methods cons 1 its natural to apply quantizationaware training on supernets when we want a supernet to be quantizationaware both quantizationaware training and supernets are readytouse techniques and the combination is straightforward the benefit of bitinheritance is a good observation while using pretrained weights as initialization is kind of a common practice in quantization or model compression at this point the contribution in terms of the novelty is limited 2 the proposed method can also outperform methods that can also search layerwise bitwidth eg table 2 although the method in this paper only uses the same bitwidth for all the layers its not clear what is the main factor in this comparison are all the methods using the same experiment setup eg quantization algorithm lsq or minmax architecture search spaces it will be better to have an ablation study to understand which part of the proposed algorithm plays the key role to the better performance in general i think this paper did a great job on the experiments of quantizationaware supernet but the novelty contribution is slightly under the criteria of iclr so my rating is borderline i hope the authors can give some 
response to the cons listed above and id like to consider changing my rating if i missed something important docsepthis paper presents a new method to search for quantized neural networks this method is different from others that it results in quantized weights which can be deployed without postprocess such as finetuning proposed method first trains a 4bit quantized supernet and search for the best performance subnet using the validation dataset then the method initialize the 3bit supernet using the 4bit supernet and trains 3bit supernet using the knowledge distilation method proposed method iterates the initialization training and search process until the goal bit resolution is achieved i find that the idea of constructing quantized supernets using bit inheritance is interesting and the paper is well written however i think more precise description of the training method and additional analysis is required to improve the paper 1 in section 34 you described the k to k1 supernet inheritance process as during training we use the k and k 1 bitwidth supernets as teacher and student and train them in a knowledge distillation way to further reduce the quantization error between the k 1 and k bitwidth parameters however the knowledge distillation method is not specified after the statement is it similar to qkd or is it more of a traditional knowledge distillation approach more precise description will be helpful 2 in the introduction you described that this twostage procedure will undesirably increase the number of models to be retrained if we have multiple deployment constraints and hardware bitwidths however there is no analysis on the search cost under such scenarios since the proposed method induces more supernet training process compared to nasthenquantize approaches such as apq it is unclear whether your method will acheive lower search cost or not i believe that additional analysis on the benefit of deploying the quantized weights without retraining in the mean of search cost must be given in the paper as one of the main contribution of the paper is that the proposed method allows the deployment without retraining 3 minor in section 34 you described that where the parameters of the k 1 bit network inherit from the parameters of the k 1 bit network i think the later k 1 must be kdocsepsummary this paper performs a joint optimisation for dnn models making the nas scheme is aware of both the quantisation and architectural search spaces the paper presented a large range of comparisons to different quantisation strategies and ran a lot of experiments to support their claims however the writing quality of this paper is worrying also i am a little worried about the novelty of this paper strength 1 there are a lot of experiments with the proposed method showing a great empirical value for researchers in this field i consider results shown in figure 2 and table 2 very supportive evidence of the effectiveness of the proposed method 2 it is nice to see a large scale study 15k architectures on some common properties of network architectures and their interactions with quantisation 3 to my knowledge this paper does present a stateoftheart number for lowprecision imagenet classification weakness 1 the writing quality of this paper is worrying this is not simply to do with the use of language but also on the clarity of some matters i strongly recommend the authors to have a serious polish of their paper since they do present valuable results and stoa numbers 2 to me the novelty of this paper is limited 
it seems like an extension to onceforall and the authors also cited this work the teacherstudent technique is also a published idea the authors claim this is the first piece of work of nas without retraining however they are iteratively reducing the bitwidth k which implies a large training cost and is somehow equivalent to retraining the method in the paper looks like a combination of a number of wellknown techniques which might limit the novelty claim in this paper however i have to say i am not very troubled with combining a bunch of existing techniques if it show new stoa that is outperforming by a significant margin this weakness is only minor to me my suggestions confusions 1 it seems like you can boost the performance of quantised networks from a jointly search for architectures and quantisation and b teacherstudent alike quantisation training with inherited weights could you test these two parts in isolation and quantify the contributions of each technique 2 why you quantise activations to unsigned numbers page 4 dont you consider activations like leakyrelu in your activation search space or you do not search activations at all nas methods suffer from more unreliable order preserving who are you comparing to in this case is it more unreliable compared to rl based nas 3 what is your flops reported in table 2 flops means floating point operations do you mean bitops or you somehow scaled flops with respect to bitwidths 4 we focus on the efficient models under one fixed low bitwidth quantization strategy do you mean the network is uniprecision so no layerwise mixedprecision is allowed 5 i spotted a number of misused languages and will strongly recommend authors to check mistakes like a ambiguity i with high floatingpoint performance do you mean floatingpoint models or you mean customised floatingpoint models describe floatingpoint as high is very misleading ii quantize the network with retraining i guess i understand what you mean but you might say retrain the quantised models to be less ambiguous b grammar i different bitwidth different bitwidths ii quantization supernet quantized supernet and so on c do not assume readers have prior knowledge i we use sandwich rules we use the sandwich rule and maybe you should consider explain what it is i cannot present all the mistakes here these are just examples i would iterate again that i would strongly recommend you to polish the paper since i do like the results you are presenting and think if the code is opensourced they will benefit the community
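the bit-inheritance and teacher-student questions raised above are not resolved in the text, so the following is only a guessed sketch of one common way to distil a (k-1)-bit student supernet from a frozen k-bit teacher (pytorch assumed, every name hypothetical), not the authors' actual procedure:

```python
# guessed sketch only: soft-label distillation from a frozen k-bit supernet
# (teacher) to a (k-1)-bit supernet (student) initialised from the teacher's weights
import torch
import torch.nn.functional as F

def distill_step(student, teacher, x, y, temperature=2.0, alpha=0.5):
    with torch.no_grad():
        t_logits = teacher(x)                      # k-bit teacher, not updated
    s_logits = student(x)                          # (k-1)-bit student being trained
    kd = F.kl_div(F.log_softmax(s_logits / temperature, dim=1),
                  F.softmax(t_logits / temperature, dim=1),
                  reduction="batchmean") * temperature * temperature
    ce = F.cross_entropy(s_logits, y)
    return alpha * kd + (1.0 - alpha) * ce         # combined loss for one step
```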
### Summary: | this paper proposes a method to train quantized supernets which can be directly deployed without retraining a main concern is that there is limited novelty since the proposed method looks like a combination of well-known techniques experimental results are promising however it is not clear if the comparisons are fair and if all the methods use the same setup additional analysis and ablation studies are desirable and the writing can also be improved |
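on the review question of what the reported flops mean for quantized models, one common convention (an assumption here, not necessarily the one the authors used) scales multiply-accumulates by the weight and activation bitwidths; a small worked example:

```python
# hypothetical convention: bitops = MACs * weight_bits * activation_bits
macs   = 300e6          # illustrative multiply-accumulate count for one network
w_bits = 4
a_bits = 4
bitops = macs * w_bits * a_bits      # 4.8e9 under this convention
# an 8-bit version of the same network would report 300e6 * 8 * 8 = 1.92e10,
# so stating which convention a comparison table uses matters
```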
[ input_ids for this row: token-id array, truncated at the end of this excerpt, omitted here as numeric dataset-viewer residue ]
10262,
247,
747,
1332,
281,
3186,
323,
2677,
1025,
11454,
6928,
436,
1332,
310,
1027,
432,
2571,
326,
352,
1543,
275,
2677,
1025,
13461,
534,
476,
320,
18329,
1293,
1501,
7404,
824,
347,
1442,
292,
25004,
4081,
1332,
806,
18784,
247,
577,
2713,
2677,
1025,
2221,
3024,
285,
3186,
323,
253,
1682,
3045,
749,
3024,
970,
253,
12820,
10895,
840,
253,
1332,
26641,
253,
495,
2713,
2221,
3024,
970,
253,
577,
2713,
2221,
3024,
285,
18784,
495,
2713,
2221,
3024,
970,
253,
3640,
940,
10108,
1332,
4081,
1332,
10040,
684,
253,
31850,
3733,
285,
3186,
1232,
1919,
253,
4736,
2372,
6064,
310,
6786,
50276,
74,
1089,
326,
253,
2934,
273,
26736,
2677,
1025,
13708,
1507,
970,
2372,
24954,
310,
4722,
285,
253,
2929,
310,
973,
3542,
2299,
891,
1158,
625,
10799,
5740,
273,
253,
3733,
1332,
285,
3081,
1783,
310,
2424,
281,
3157,
253,
2929,
50276,
18,
275,
2593,
5910,
368,
2529,
253,
465,
281,
465,
18,
2221,
3024,
24954,
1232,
347,
1309,
3733,
359,
897,
253,
465,
285,
465,
50276,
18,
2372,
3429,
13708,
1507,
347,
9732,
285,
5974,
285,
6194,
731,
275,
247,
3640,
940,
21755,
1039,
281,
2007,
4796,
253,
36643,
2228,
875,
253,
465,
337,
285,
465,
2372,
3429,
3602,
2299,
253,
3640,
940,
21755,
1332,
310,
417,
7616,
846,
253,
3908,
310,
352,
2074,
281,
2805,
76,
69,
390,
310,
352,
625,
273,
247,
5899,
3640,
940,
21755,
2746,
625,
10799,
5740,
588,
320,
9371,
50276,
19,
275,
253,
10199,
368,
2529,
326,
50276,
2520,
2500,
493,
486,
5199,
588,
19231,
343,
1598,
2572,
253,
1180,
273,
3210,
281,
320,
851,
11273,
604,
359,
452,
2709,
19007,
10806,
285,
10309,
2372,
3429,
84,
2299,
627,
310,
642,
1783,
327,
253,
3186,
2105,
762,
824,
15216,
1580,
253,
4081,
1332,
14757,
625,
2221,
3024,
3733,
1232,
2429,
281,
295,
505,
864,
17149,
907,
7274,
824,
347,
1049,
82,
352,
310,
12744,
1880,
634,
1332,
588,
49652,
422,
2406,
3186,
2105,
390,
417,
891,
2868,
326,
3081,
1783,
327,
253,
5649,
273,
45021,
253,
2677,
1025,
13461,
1293,
851,
26208,
275,
253,
1599,
273,
3186,
2105,
1364,
320,
1677,
275,
253,
2929,
347,
581,
273,
253,
2022,
7680,
273,
253,
2929,
310,
326,
253,
4081,
1332,
4483,
253,
19007,
1293,
851,
26208,
50276,
20,
5884,
275,
2593,
5910,
368,
2529,
326,
50276,
2811,
253,
3602,
273,
253,
465,
50276,
18,
2372,
2990,
30686,
432,
253,
3602,
273,
253,
465,
50276,
18,
2372,
2990,
891,
1158,
253,
1996,
465,
50276,
18,
1364,
320,
465,
7152,
339,
793,
360,
3454,
209,
186,
2520,
2929,
17923,
247,
6036,
5556,
5837,
323,
277,
9866,
3210,
2403,
253,
13332,
6974,
310,
6600,
273,
1097,
253,
2677,
5837,
285,
27934,
3186,
8470,
253,
2929,
3559,
247,
1781,
2491,
273,
14023,
281,
1027,
2677,
5837,
8130,
285,
6337,
247,
2257,
273,
4679,
281,
1329,
616,
3916,
2299,
253,
4028,
3290,
273,
436,
2929,
310,
29124,
671,
891,
717,
247,
1652,
11926,
670,
253,
38135,
273,
436,
2929,
50276,
45563,
337,
627,
403,
247,
2257,
273,
4679,
342,
253,
4081,
1332,
4645,
247,
1270,
16774,
1318,
323,
8607,
275,
436,
1673,
891,
1908,
1543,
2011,
275,
4677,
374,
285,
2829,
374,
1077,
23384,
1941,
273,
253,
12510,
273,
253,
4081,
1332,
374,
352,
310,
5322,
281,
923,
247,
1781,
4311,
1263,
1458,
76,
35615,
327,
690,
1846,
3607,
273,
2990,
35615,
285,
616,
6355,
342,
2677,
5837,
495,
281,
619,
3640,
436,
2929,
1057,
1246,
247,
1375,
23037,
14387,
1180,
323,
1698,
40540,
4440,
257,
292,
9162,
50276,
20881,
1255,
337,
253,
4028,
3290,
273,
436,
2929,
310,
29124,
436,
310,
417,
3365,
281,
513,
342,
253,
897,
273,
3448,
533,
671,
327,
253,
19843,
273,
690,
8213,
891,
7052,
5583,
253,
4477,
281,
452,
247,
4092,
40167,
273,
616,
2929,
1580,
597,
513,
1246,
9865,
1543,
285,
4806,
66,
3904,
374,
281,
479,
253,
38135,
273,
436,
2929,
310,
3710,
352,
3133,
751,
271,
6880,
281,
2378,
14570,
285,
253,
4477,
671,
11106,
436,
789,
253,
9732,
39095,
5853,
310,
671,
247,
3863,
2934,
253,
4477,
1750,
436,
310,
253,
806,
5313,
273,
789,
273,
13332,
1293,
851,
26208,
2299,
597,
403,
10040,
3146,
8493,
253,
2372,
3429,
465,
534,
8018,
247,
1781,
3733,
2105,
285,
310,
10380,
6425,
281,
851,
26208,
253,
1332,
275,
253,
2929,
4453,
751,
247,
5019,
273,
247,
1180,
273,
973,
4304,
5609,
534,
1537,
2701,
253,
38135,
1750,
275,
436,
2929,
2299,
891,
452,
281,
1333,
891,
717,
417,
1077,
26504,
342,
16248,
247,
12190,
273,
5368,
5609,
604,
352,
921,
747,
4806,
66,
326,
310,
41731,
14692,
407,
247,
1534,
8459,
436,
14855,
310,
760,
5884,
281,
479,
50276,
2577,
13991,
50276,
8259,
16723,
337,
352,
3133,
751,
368,
476,
9510,
253,
3045,
273,
2677,
1701,
6928,
432,
247,
26277,
3186,
323,
35615,
285,
2677,
5837,
285,
270,
9732,
39095,
19605,
2677,
5837,
3733,
342,
20265,
13461,
812,
368,
1071,
841,
767,
4243,
275,
12940,
285,
22048,
253,
9021,
273,
1016,
5853,
374,
2139,
368,
2677,
885,
1396,
569,
281,
10698,
3904,
3239,
577,
13414,
368,
1908,
1396,
569,
751,
13584,
90,
1661,
86,
275,
634,
5743,
3186,
2317,
390,
368,
513,
417,
3186,
1396,
569,
387,
512,
50276,
27109,
3082,
11089,
432,
625,
36230,
1340,
24279,
665,
403,
368,
10941,
281,
275,
436,
1083,
310,
352,
625,
36230,
2429,
281,
391,
77,
1754,
13332,
495,
752,
310,
634,
892,
2695,
2361,
275,
2829,
374,
892,
2695,
2097,
14974,
1127,
5871,
513,
368,
1599,
2372,
2695,
390,
368,
10380,
24337,
892,
2695,
342,
1675,
281,
2372,
3429,
84,
577,
359,
2770,
327,
253,
5919,
3210,
762,
581,
4229,
1698,
2372,
3429,
36643,
5700,
513,
368,
1599,
253,
2990,
310,
440,
532,
2845,
1297,
594,
642,
3828,
3020,
6804,
40540,
310,
4136,
608,
891,
20673,
247,
1180,
273,
3731,
3197,
11515,
285,
588,
7052,
5583,
4477,
281,
2451,
16503,
751,
247,
28931,
50275,
74,
342,
1029,
14974,
3659,
3045,
513,
368,
1599,
14974,
3659,
3210,
390,
368,
1599,
2840,
1701,
14974,
3659,
3210,
6266,
14974,
3659,
347,
1029,
310,
1077,
24363,
21255,
2677,
907,
253,
2990,
342,
851,
26208,
891,
5476,
891,
2096,
752,
368,
1599,
533,
368,
1537,
1333,
851,
1949,
253,
2677,
1701,
3210,
281,
320,
1679,
23851,
50275,
67,
28146,
891,
1027,
2372,
3429,
50276,
19623,
2372,
3429,
84,
21255,
36643,
2221,
3024,
50276,
17149,
1025,
2221,
3024,
285,
594,
327,
50276,
68,
513,
417,
5467,
10668,
452,
2720,
3640,
891,
359,
897,
25749,
4803,
50276,
664,
897,
253,
25749,
4086,
285,
5046,
368,
943,
1908,
5513,
752,
352,
310,
50276,
74,
2550,
1246,
512,
253,
16503,
1060,
841,
403,
816,
6667,
891,
651,
35388,
969,
326,
891,
651,
7052,
5583,
368,
281,
40167,
253,
2929,
1580,
891,
513,
751,
253,
1543,
368,
403,
15250,
285,
1158,
604,
253,
2127,
310,
13279,
47549,
597,
588,
5649,
253,
3114,
50272,
187,
187,
4118,
18435,
27,
2520,
2929,
4081,
247,
1332,
281,
6194,
2677,
1025,
13708,
1507,
534,
476,
320,
3587,
18329,
1293,
851,
26208,
247,
2022,
4468,
310,
326,
627,
310,
3710,
38135,
253,
4081,
1332,
4453,
751,
247,
5019,
273,
973,
4304,
5609,
5661,
1543,
403,
12532,
2299,
352,
310,
417,
2590,
604,
253,
14023,
403,
4344,
285,
604,
512,
253,
3082,
403,
970,
253,
1072,
9978,
352,
310,
11408,
281,
452,
3081,
1783,
285,
28913,
2175,
253,
4028,
476,
671,
320,
5520
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents a new paradigm for defending against adversarial attacks with multiple submodels different from ensemble the proposed collaboration paradigm a representative submodel is chosen to make the decision instead of letting all submodels vote the proposed method has been validated on cifar10 dataset against both whitebox and transferrabilitybased blackbox attacks strengths 1 the proposed method is clearly motivated and defined and it has been demonstrated effective by quantitative experimental results 2 a comprehensive overview of related works is provided weaknesses 1 training cost is also a notable issue for adversarial training methods it would be better if the training cost of the proposed method is compared with that of the previous methods this paper presents a new paradigm involving multiple submodels which has several advantages compared to ensemble it has been validated effective by the quantiative results in terms of robustness but comparison in other aspects such as training cost could be also helpful docsepthis paper firstly analyzes prior adversarial defense methods using ensemble strategy and claims that this method could cause a waste of model capacity to improve the utilization of multiple model capacity the author proposes an interesting collaboration strategy cda2 to defend against adversarial attacks specifically the author develop a dualhead model structure one is for making a prediction and the other is for predicting the posterior probability of the input during training each model can address the adversarial attacks of other submodels so that it improves the robustness of the collaboration the experimental results partially verify the superiority of the proposed methods strength 1 this work considers the adversarial defense problem it focuses on the insufficient model capacity of adversarial training and presents a completely fresh perspective on learning multiple models to improve the robustness the proposed framework is technically solid the idea of collaboration to minimize the vulnerable area is innovative compared with baselines in summary i assess a high novelty of this work 2 the proposed framework cda2 makes sense for the goal of minimizing the vulnerable area of the overlap a dualhead model is used for predicting and providing information about how to assign the adversarial sample optimizing the bestperforming submodel is to minimize the vulnerability overlap of all submodels the framework has contributed to the research problem and may enlighten other research problems it could be a good paper for the iclr community 3 this work is wellwritten figs 1 and 2 show the motivation of the proposed method clearly which helps understand it weakness 1 the author states that the insufficient model capacity can hurt its performance in adversarial training what if the model has sufficient model capacity are there other scenarios that have insufficient model capacity whether cda2 can be useful for such scenarios if any compared to a big single model can the author discuss more pros and cons of collaboration 2 to achieve collaboration the author proposes to assign the samples to the submodels that perform best does it obtain a trivial case for example only one submodel has been trained can the author discuss more 3 for the framework cda2 ppd head is for evaluating the performance of the other head can the author discuss the impact on the quality of this head more 4 the experiment on the xor problem is a little bit weak can the author provide more experiments 
to verify the effectiveness of cda2 5 for the experimental results on the whitebox the author gives the results in table 1 the experimental setting is a little vague im wondering about the robustness performance of pgd50 as in 1 6 for the blackbox experiments the author uses mfgsm and pgd to generate transferable adversarial samples can the author provide more experimental results to validate its claims overall there are still some issues stated above i will increase my score if they are well addressed 1 dverge diversifying vulnerabilities for enhanced robust generation of ensembles in neurips 2020a post rebuttal responses thanks for the authors responses they address my concerns on the concept of collaboration and the newly added experiments are convincing especially i like the idea of collaboration in adversarial training which is a new paradigm to defend against adversarial attacks besides robustness the new paradigm may be helpful to other domains after reading the review from other reviewers and the corresponding responses i vote for acceptance and increase my scores further a novel and interesting collaboration method for advancing the robustness of multiple submodels docsepthis paper proposes an ensemble or mixtureofexperts method to defend against adversarial examples though the authors prefer to use the term collaboration method to highlight its difference from vanilla ensemble the main idea is that during adversarial training the submodels are trained on each others adversarial examples specifically for each training image one adversarial example is generated per submodel by carrying out an attack on each submodel each adversarial example is softly assigned to the submodel that has the lowest loss on it as a training image each submodel has a second output called ppd that quantifies its confidence at inference time for each input the submodel with the highest ppd produces the output the rationale is that because each submodel only needs to cover part of the adversarial example space they can do a better job experiments on cifar10 with linf attacks are reported the idea of adversarial training on each others adversarial examples is new however it is flawed at least in its current form the issue is that such adversarial training may not provide enough coverage and after the training converges there may exist adversarial examples that can attack all submodels consider a hypothetical situation with two submodels both submodels classify clean images well however submodel a is easily attacked by adding a faint cross pattern near the upper left corner and submodel b is easily attacked by adding a faint cross pattern near the lower right corner during the proposed training procedure well only encounter these two types of adversarial examples because they are respectively the best attack with lowest lp epsilon on the two submodels then the two submodels learn to solve each others adversarial examples submodel a will become robust against faint cross patterns near the lower right corner and submodel b will become robust against faint cross patterns near the upper left corner the training procedure therefore converges quickly however after the adversarial training has converged a vast space of adversarial examples has not been explored at all there may exist adversarial examples that although their epsilon is slightly higher than the faint cross patterns they can attack both models the above discussion can be easily extended to the general case of m submodels the point is that the proposed 
adversarial training procedure can be selflimiting and may not provide enough coverage the experimental results are not sufficient to support the claims standard accuracies on clean images are not reported in table 1 they are critical missing information the cifar10 model in madry et al 2018 has robust accuracy of 45.8% against linf epsilon of 8/255 (approx 0.031) with pgd20 table 1 shows that the proposed method has robust accuracy of 44.5% against linf epsilon of 0.03 with pgd10 so it is unclear that the proposed method has an advantage this paper seems to confuse blackbox attacks and transfer attacks tables 2 and 3 are transfer attacks there are no blackbox results in this paper the role of the ppd head is not explained well according to figure 3b and algorithm 1 line 6 the ppd head is trained by minimizing binary cross entropy between it and the truelabel logit from the normal output it seems that because the ppd head is trained to also track wrong predictions of a submodel it can serve as a confidence score at inference time equations 11, 12, 13 and the surrounding text do not help and they seem irrelevant to the actual implementation the philosophical idea claimed in the conclusion section is not new what is new here is the adversarial training scheme where submodels train on each others adversarial examples
1 unfortunately the main idea is flawed and the proposed adversarial training may converge without providing enough coverage
2 the experimental results do not show advantage over previous nonensemble method
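To make the training signal discussed in this review concrete, here is a minimal, hypothetical sketch (PyTorch-style) of a dual-head submodel whose ppd head is fit with binary cross-entropy against the sigmoid of the true-label logit produced by the label head. This is only one plausible reading of the scheme described above; the class, function and parameter names are my own illustrative choices, not the authors' implementation.

```python
# Hypothetical sketch of the dual-head submodel and ppd loss discussed above;
# names, shapes and the exact BCE target are assumptions made for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualHeadSubmodel(nn.Module):
    def __init__(self, backbone: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.backbone = backbone                      # shared feature extractor
        self.label_head = nn.Linear(feat_dim, num_classes)
        self.ppd_head = nn.Linear(feat_dim, 1)        # scalar confidence ("ppd")

    def forward(self, x):
        feats = self.backbone(x)
        return self.label_head(feats), self.ppd_head(feats).squeeze(-1)

def dual_head_loss(model, x, y):
    """Cross-entropy on the label head plus BCE on the ppd head.

    The BCE target used here is sigmoid(true-class logit) from the label head,
    one literal reading of "binary cross entropy between it and the true-label
    logit"; a hard correct/incorrect indicator would be another option.
    """
    logits, ppd_logit = model(x)
    cls_loss = F.cross_entropy(logits, y)
    true_logit = logits.gather(1, y.unsqueeze(1)).squeeze(1)
    target = torch.sigmoid(true_logit).detach()       # assumed soft target
    ppd_loss = F.binary_cross_entropy_with_logits(ppd_logit, target)
    return cls_loss + ppd_loss
```

Under this reading, a confidently correct label head drives the ppd target towards 1 and a wrong prediction drives it towards 0, which matches the reviewer's observation that the head effectively tracks wrong predictions and can act as a confidence score at inference time.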
### Summary:
the paper proposes a novel ensemble method cda2 in which base models collaborate to defend against adversarial attacks to do so the base models have two heads the label head for predicting the label and the posterior probability density ppd head that is trained by minimizing binary cross entropy between it and the truelabel logit given by the label head during inference the base model with the highest ppd value is chosen to make the prediction during training base models learn from the adversarial examples produced by other base models the evaluation of the manuscript by the different reviewers was very diverse resulting in final scores ranging between 3 and 8 after the discussion period while the rebuttal clearly addressed the concerns of one reviewer and several additional experimental results were added for different adversarial attacks it did not fully address the concerns of another reviewer who rated his confidence higher he was also not convinced by the update in the revised version of the manuscript in which crucial changes in the pseudocode describing the proposed algorithm were made which contradicted some statements in the first version therefore the paper can unfortunately not be accepted in its current version in a future version of the manuscript the description of the algorithm and of the role of the ppd head should be improved and experiments on another dataset next to cifar10 could be added
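The inference rule summarized here, routing each input to the base model whose ppd head is most confident, could be sketched as follows; again this is an illustration with made-up names (reusing the hypothetical dual-head model above), not the paper's code.

```python
# Illustrative only: pick, per input, the prediction of the base model whose
# ppd head reports the highest confidence.
import torch

@torch.no_grad()
def collaborative_predict(models, x):
    """models: iterable of dual-head submodels; x: a batch of inputs."""
    logits_list, ppd_list = [], []
    for m in models:
        logits, ppd_logit = m(x)
        logits_list.append(logits)
        ppd_list.append(torch.sigmoid(ppd_logit))
    all_logits = torch.stack(logits_list)        # (K, B, C)
    all_ppd = torch.stack(ppd_list)              # (K, B)
    chosen = all_ppd.argmax(dim=0)               # index of best model per input
    batch_idx = torch.arange(x.shape[0], device=x.device)
    return all_logits[chosen, batch_idx].argmax(dim=-1)
```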
[numeric token-id and all-ones mask arrays for this example omitted; they encode the same text shown above]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
fedglomo is a communicationefficient method that integrates variance reduction both at the server and at local client updates it also has a better iteration complexity than prior works on nonconvex functions with compressed communication experiments show the efficacy of the proposed method assumption 4 is not standard see the comment below although assumption 4 is not standard and the authors claimed that it holds for alpha n i do not see why alpha is necessary in the analysis as far as i understand the analysis still holds at the same magnitude of epsilon when alpha n is this true i think it is beneficial if the authors add a discussion on how the theoretical analysis changes when alpha n another minor thing the fonts of algorithm names are not consistent eg the title of section 4 had some error in changing the fonts

this paper is well written the algorithms and theoretical results are presented in an accessible way and discussed in detail the results seem to be competitive it should be discussed in more detail how this result matches that of existing work in different settings with different problemdependent parameters this paper studies nonconvex and smooth optimization in a federated learning setting the authors propose an algorithm with both local variancereduced momentum and global variancereduced momentum they also propose a new assumption on the gradient dependence of each worker machine they prove that the proposed algorithm achieves the best iteration complexity and compare it in experiments with baselines this paper is well written the algorithms and theoretical results are presented in an accessible way and discussed in detail the results seem to be competitive it should be discussed the relationship between r and nalpha when presenting the complexity this should also be noted in table 1 that the complexity does not exactly match that of karimireddy et al 2020 i am not sure when assumption 4 actually holds in practice with a small alpha on note that epsilon depends on 1n sumj wj for all agents this means all epsilon could be dependent in some way and thus alpha cn for some small constant but should be the same order as n

a new algorithm with better complexity is proposed the algorithm can be applied to many ml problems the theoretical results are obtained by introducing the new assumption 4 im not very familiar with federated learning and it is hard for me to judge how restrictive it is i find the paper well written there are places where the statement could be improved for example in definition 3.1 twice differentiability is not needed i recommend to check the paper once again
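For readers unfamiliar with the variance-reduced momentum these reviews refer to, a generic STORM-style local update of the kind commonly used in such analyses can be sketched as below. This is a textbook-style illustration under my own naming, not fedglomo's actual pseudocode, which additionally involves a server-side (global) correction and compressed communication.

```python
# Generic illustration (not the paper's pseudocode) of a momentum-based
# variance-reduced local update:
#   d_t = g_t(x_t) + (1 - a) * (d_{t-1} - g_t(x_{t-1})), same minibatch at both points.
import torch

def vr_momentum_step(x, x_prev, d_prev, idx, stoch_grad, lr=0.05, a=0.1):
    g_new = stoch_grad(x, idx)
    g_old = stoch_grad(x_prev, idx)              # reuse the minibatch at x_prev
    d = g_new + (1.0 - a) * (d_prev - g_old)     # variance-reduced momentum
    return x - lr * d, d

# toy client objective: 0.5 * ||A x - b||^2 with minibatch gradients
A, b = torch.randn(32, 4), torch.randn(32)
def stoch_grad(x, idx):
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / len(idx)

x, x_prev = torch.zeros(4), torch.zeros(4)
d = stoch_grad(x, torch.arange(32))              # initialise with a full gradient
for _ in range(20):
    idx = torch.randint(0, 32, (8,))
    x_new, d = vr_momentum_step(x, x_prev, d, idx, stoch_grad)
    x_prev, x = x, x_new
```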
### Summary:
meta review: the authors have addressed the reviewers concerns and none of the reviewers is against the publication of the paper please add the discussion with the reviewer into the final version especially the discussion on assumption 4 the authors should also add the new empirical results too
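The exact statement of assumption 4 is not reproduced in this review thread. Purely to illustrate the kind of bounded-heterogeneity condition being debated, with alpha controlling how many clients it covers and epsilon the allowed dissimilarity, such assumptions are often written in a form like the following; this is my own generic rendering, not the paper's statement.

```latex
% Generic bounded-heterogeneity condition, for illustration only
% (NOT the paper's actual Assumption 4).
\[
\left\| \nabla f_j(x) \;-\; \frac{1}{n}\sum_{i=1}^{n} \nabla f_i(x) \right\|
\;\le\; \epsilon
\qquad \text{for at least } (1-\alpha)\,n \text{ clients } j,\ \text{for all } x .
\]
```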
[numeric token-id and all-ones mask arrays for this example omitted; they encode the same text shown above]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper uses federated machine learning for predictive maintenance on optical networks federated learning provides a number of advantages including security privacy and accuracy the accuracy claims are motivated by having a broader set of failure examples to draw from addressing a known issue in predictive maintenance privacy and security claims are important in predictive maintenance but were less novel or well supported they felt more like off the shelf methods applied in a new domain their primary comparison was against a global model i think this is valuable work and i liked many parts of the paper however this work needs to be significantly matured prior to publication the major issue i had with this paper is that it couldnt decide what it wanted to be the primary benefit claim was in reducing costs of maintenance but that was not evaluated instead they put extensive time and effort into discussion of security and privacy concerns without significant ties to machine learning they also spent extensive time reviewing material that should be assumed background reading for the iclr audience regarding the primary claim of reduced operational costs they did not measure this directly rather they evaluated a proxy measures of precision recall accuracy and f1 score which are incomplete evaluations for operational benefit the key measure that they left out was the timeliness of detection and this was not evaluated security and privacy are always challenging to evaluate i give the authors credit for elevating these to first class concerns in this paper they evaluated privacy and security using robustness as a measure with different percentages of malicious clients giving motivated noisty data as input the noutofn scheme was a nice application to limited model inversion attacks they also made effective use of an anomaly detectors to detect data poisoning attacks while i suspect there are ways to fool an anomaly detector this still makes such attacks significantly more challenging on a more superficial note large parts of this paper should be assumed knowledge for an iclr audience for example equations 17 do not meaningfully improve any of their results are not referenced again and do not help a reader reproduce any of the results the methods section needs more details on the structure of the training and test datasets what are the features how long are the series how were your experiments setup much of this is in section 41 but i had to go looking for it the data that was used and baseline efforts were somewhat unsatisfying only one nonpublic dataset was evaluated and it was unclear how well that dataset reflected many of needs identified in the use case splitting a monolotic dataset into 10 parts is closer to a 10fold cross validation scheme than a federated learning scheme while splitting the data up this way might be required for federated learning without clearly identfied splitting criteria such as client 10 all were all deployed in the same geographic regon or used hardware from the same vendor i found it difficult to connect the experiments to the identified needs this is a promising but immature work it needs to be focused on only one of the identified problems and the evaluation needs to be significantly improved it seems like there is potential for multiple papers in this work but it needs to be split up and experiments carried out more carefully this paper would also benefit from a more direct connection to the identified use case as well as some justification of why that use case 
is realistic docsepthis paper demonstrates that for the problem of doing predictive maintenance in optical networks accurate and reliable ml models could be developed in collaborative learning without the disclosure of vendors data even in malicious setting the authors did some experiments to show that 1 federated learning can help to achieve similar prediction accuracy as the centralized approach 2 malicious behaviors may happen in federated learning but it can be detected through autoencoder strengths this paper provides an interesting application of using ml weakness lack of ml technique related novelty this paper directly leverages the existing ml techniques eg fedavg and autoencoder wo proposing new techniques i think this is just a pure ml application paper motivation doubt i wonder whether there will be the malicious venders in the real world if the vender wants to pollute the global model then the malicious datamodel will affect the venders own prediction accuracy as well then why does heshe want to do that not sure whether the authors can address the motivation clearer why antoencoder is picked for anomaly detection why not the other approaches this paper provides an interesting application of using ml docsepthis paper designs a maintenance prediction framework for key optical network components based on federated learning the designed framework can resist malicious environments and several kinds of attacks the paper uses simulation data to verify that the proposed method has good predictive ability and can withstand the designed simulation attack the focus of this study is not clear whether it is to solve the challenge of collaborative failure prediction or the problem of encryption aggregation in federated learning fl in fact the related work part only summarized three related studies i believe this is far from enough to reflect the related research progresses on the one hand secure aggregation is a fundamental task of fl we can see a lot of recent progress see below for example although as the authors said they typically focus on the global models accuracy efficiency and scalability issue however it does not mean that they are not suitable for the problem scenarios of collaborative prediction of optical transmitter degradation simply because we do not need these features fereidooni hossein et al safelearn secure aggregation for private federated learning ieee security and privacy workshops spw 2021 li yong et al privacypreserving federated learning framework based on chained secure multiparty computing ieee internet of things journal 88 2020 61786186 kadhe swanand et al fastsecagg scalable secure aggregation for privacypreserving federated learning arxiv preprint arxiv200911248 2020 yang chiensheng et al lightsecagg rethinking secure aggregation in federated learning arxiv preprint arxiv210914236 2021 constance beguier et al efficient sparse secure aggregation for federated learning arxiv preprint arxiv200714861 2021 on the other hand flbased prediction tasks have also been extensively studied although they are not necessarily designed for optical network predictive maintenance the motivation of this paper is also unclear the problem scenario seems unrealistic to put it bluntly one can purchase batches of semiconductor laser products from the open market to test themselves which will produce similar results using fl does not change the limitation that vendors can only rely on accelerated aging tests to predict maintenance in terms of innovation the paper directly uses the 
fedavg framework and the prediction and fault detection methods leveraged are also traditional in the threats analysis part the paper shows three possible attacks in this specific scenario they are probably based on the authors assumptions or there is no clear evidence to show the purpose of these attacks for example is it possible for a model inversion attack to obtain real data from other vendors why not just buy a product to test directly during the local model poisoning attack if an attacker tries to pollute the global model with poisoned data the prediction results it obtains will also be meaningless which is contrary to the fundamental motivation as the paper claimed instead each vendor receives the personalized maintenance report which contains the discrepancy between its local model and the global model which is useful to improve the quality of products in the future lastly in the experimental part the paper splits data into ten pieces to simulate ten clients which cannot reflect the diversity of products from different manufacturers in addition there are some obvious language problems as listed below in abstract it is challenging to develop an accurate and reliable ml based prognostic models accurate and reliable ml based prognostic models or an model in page 1 paragraph 1 global predictive maintenance market is expected the global predictive maintenance market in page 2 paragraph 5 a secure aggregation protocol is tolerant to the malicious behavior of participants in a honestmajority model an honestmajority model in page 2 paragraph 5 such framework does not fit for our use case due to multiple reasons such a framework in page 2 paragraph 5 may give negative impact on the data owners business give a negative impact in page 3 paragraph 5 whereby different semiconductor laser manufacturers ie vendors collaborate ie vendors in page 5 paragraph 4 for example in fredrikson et al 2015 authors demonstrated a model inversion attack the authors in page 7 paragraph 2 the architecture of global model is composed of two gru layers containing each 64 cells containing 64 cells in summary although the paper tackles a novel problem and designs an flbased framework for the target it shows significant defects in its motivation innovation and experiments as a result i do not recommend accepting the paper docsepthe paper is on a very interesting topic predictive maintenance that has a big impact on various industries it presents a conceptframework for collaborative learning in predictive maintenance application where a global model is trained based on different vendors training dataset without sharing the data the main challenge in predictive maintenance is often models trained on a specific unitasset will not generalize and perform well when tested on a different unit of the same type for example if the model is trained on a ball bearing under certain operating condition and then tested on a different bearing under a different operating condition the result will not be good in the proposed concept of collaborative learning with different datasets from different vendors it is very unlikely that the different vendors use relatively similar assets the paper data set is not from different vendors therefore the prediction of the model under such condition will be not acceptable this challenge is not addressed in this paper the paper should state exactly whether it is tackling a prognostic prediction of remaining useful life or diagnostic detection of the fault type or anomaly detection problem this is 
not clear from the paper in page three authors mention of prognostics later on an ae is used for anomaly detection please elaborate in the paper what type of problem is being solved here the paper uses accelerated testing data while such data is widely used to build reliability models for different assets their application on the normal condition experiment is questionable the model trained on accelerated life tests cannot be tested on normal condition data the description of the dataset should be enhanced in the paper not enough information is reported about the types of faultsdegradationexperimental setupdifferences of the experimental setups used while the concept is based on collaborating learning of different vendors that may use very different assets for the training sets the dataset that the authors use is not from various different assets there only difference is in operating conditions unit to unit variation similar device manufactured by different vendors with is a big challenge in predictive maintenance that can not be addressed with the above dataset used in the paper so my main point is the dataset is not suitable for demonstration of the collaborative learning concept in predictive maintenance the key contribution from the algorithm perspective is not clear in the paper please elaborate in the paper if there are any algorithmic contribution in the paper the paper states that a robust and secure model is developed however it does not provide some clear criteria for robustness as well as security so these aspects of the model should be developed further in details or may be in a separate manuscriptappendix while the presented concept of collaborative learning for predictive maintenance can be a key contribution to the domain of pdm this idea is not well developed by means of related datasets one suggestion is the authors use several bearing datasets that are available publicly ie case western reserve university dataset university of cincinnatis dataset and many more available bearing datasets and use them as different vendors and then train the global model then the model can be tested on an unseen different dataset the output should be compared with ground truth if the authors are solving a prognostics problem or be compared to a known fault type if they are solving a diagnostics classification problem with the above said enhancements needed for the paper i cannot recommend the paper for the publication
### Summary: | the paper presents an optimization technique for optical networks based on federated learning the motivation for using federated learning stems from the privacy of datasets arising from different operators the performance of the method is compared to the one based on centralized learning despite demonstrating an interesting and promising application of a federated learning the paper is rather weak in its methodical contribution its experimental evaluation however is rather artificial with an fl problem generated by splitting the dataset for a centralized problem into parts no response to the reviewers comments was provided | [
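To make the aggregation step that the reviews above keep referring to (fedavg, secure aggregation of vendor models) concrete, a minimal sketch of federated averaging is given below. It is not taken from the paper under review; the function name, the use of NumPy arrays, and the toy two-layer model are illustrative assumptions only.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Weighted average of client model parameters (the FedAvg rule).

    client_weights: list with one entry per client, each entry a list of
                    np.ndarray layer parameters in the same order.
    client_sizes:   number of local training samples held by each client.
    """
    total = float(sum(client_sizes))
    coeffs = [n / total for n in client_sizes]          # per-client weight
    n_layers = len(client_weights[0])
    # average each layer across clients, weighted by local dataset size
    return [
        sum(c * w[layer] for c, w in zip(coeffs, client_weights))
        for layer in range(n_layers)
    ]

# toy example: three "vendors", each holding the same 2-layer model shape
clients = [[np.ones((4, 4)) * i, np.ones(4) * i] for i in range(1, 4)]
global_model = fedavg(clients, client_sizes=[100, 200, 300])
```

In a setting like the one the reviews discuss, a server would run this after each communication round; defenses such as anomaly detection on client updates would sit in front of the averaging step, rejecting updates whose reconstruction error or distance from the others is abnormally large.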
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the work proposed an approach to introduce locality into the mlpbased architecture by using a shifting operation along vertical and horizontal directions strengths the work shows good results on different vision tasks weaknesses 1 the novelty of the work is quite limited the idea of shifting operation in vision has been already explored in numerous previous works such as tsm lin et al 2019 and s2mlp yu et al 2021 especially the proposed work is very similar to s2mlp and furthermore the authors proposed even an improved version s2mlpv2 which reports even better results for 224x224 image resolution 2 the presentation of the results is not fair for instance in the ablation study table 3a the authors present the proposed approach of shifting operation as superior comparing it without shifting operation however when the shift size is 11 there is no spatial interaction in the architecture all the information exchange is between the channels at the same spatial location the authors claimed that introducing locality into mlpbased architecture enables the model to obtain more local dependencies thus improve the performance but the authors did not actually prove this very clearly i think in this ablation study it is necessary to include also the case when using a mlp along spatial dimension for instance there can be two cases first using a global mlp along full spatial size and second using two mlps one mlp along horizontal spatial direction and another mlp along vertical spatial direction otherwise it can be the case that big part of the improvement comes actually from just applying a standard mlp on the horizontal spatial direction and another mlp on the vertical direction the novelty of the work is limited and the authors did not show the results when using a standard mlp approach on the spatial direction docsepthe paper proposes a new architecture for computer vision that is inspired by a the swin transformer b mlpmixer and colleagues and c cnnlike local context via shifts like shift tsm vip s2mlp the architecture is based on the swin transformer removes the windowedattention and then adds local shifts of channels to introduce local context via mlp the architecture is applied to imagenet1k classification coco detection and ade20k segmentation with good results relative to model size and performance strengths the approach is an interesting combination of existing ideas the presented experimental results are competitive and cover three tasks weaknesses the novelty of the approach seems somewhat limited taking swin mlpmixer and colleagues and shifttsmvips2mlp the added delta seems not very large the main differences are highlighted at the end of section 2 we focus on capturing the local dependencies with axially shifting features in the spatial dimension which obtains better performance and can be applied to the downstream tasks and at the end of section 33 as i through iii none of these differences is very convincing in my opinion specifically the claimed better performance is not obvious after looking at all the result tables in detail the relation to both plain mlpmixer and to convolution is not clearly shown experimentally it seems like both convolution and mlpmixer could appear as a limiting case of some form of asmlp it would be great to understand this better both in principle and through experimental comparisons the main idea of the axial shift is not introduced with sufficient clarity in my opinion even after carefully paying attention to sections 32 and 33 figures 2 and 3 algorithm 1 and 
briefly looking at the code i still had some doubts as to whether i have understood all details correctly of course that could be my fault im curious what the other reviewers think in my opinion the notation used in sec 33 is also slightly incomplete minor points in the introduction the axial shift strategy is mentioned but the explanation remains very vague and the reader has to wait until section 32 to get a more detailed description i think it would be better to introduce this main idea earlier and with more clarity enables the model to obtain more local dependencies thus improve the performance the thus here is not a priori given in my opinion at least it would require more elaboration the first work to apply mlpbased architecture to the downstream task there seems to be a strong relation to the recent cyclemlp which may be concurrent work but should nevertheless be cited and contrasted i think chen et al cyclemlp a mlplike architecture for dense prediction arxiv210710224 2021 footnote 2 this is a very strong statement cannot it is probably clear that this does not work out of the box but eg the mlpmixer paper describes how an increase in image resolution can be handled appendix c at least this statement could be discussed and explained in more detail 41 settings among the myriad of regularization strategies that exist why are specifically label smoothing and droppath chosen how many other regularization strategies were tried or maybe is this based on swin in which case it should be noted here 42 this section seems more like a parameter choice discussion than an ablation study to me 45 the statements derived from the four examples in figure 4 from figure 4 one can see that seem like overclaiming at least in parts these phenomena further show the superiority of asmlp in the context of axial attention even though the asmlp does not use attention it seems there are at least two papers that are quite wellknown and it might be useful to add them to the related work wang et al axialdeeplab eccv 2020 ho et al axial attention in multidimensional transformers arxiv191212180 2019 minor details not influencing the recommendation grammar there are several errors in english grammar the most prominent category seems the use of articles definite the vs indefinite a vs no article firstly in most cases where this word is used in the paper i think it should just be first footnote 1 i did not understand this footnote algorithm 1 xc xs uses a conditional position encoding to effectively encodes encode fig1 seems to be heavily based on swin transformers fig 3a maybe cite that paper here similarly for the specific configurations in sec 34 the introduction of locality further improves the performance of the transformerbased architecture and reduce the computational complexity reduces references some arxiv citations have appeared in conferences eg the transformer paper was published in neurips or resnet in cvpr i think it would be nice to cite the conference version for such papers i did not check all some words mostly abbreviation in reference titles should not be lowercase eg rcnn mlp resmlp consistency mmdetection has 10 authors listed but other papers stop after the first n authors with et al figure 7 and 8 while looking ok when viewed on screen turned out very garbled in the color space when printed on paper it would be useful to check this update after reading other reviewers comments the authors comments and considering the updated paper all three reviewers seem to have a similar view of the submitted 
work and agree in their rating i think the updated paper and the authors comments have addressed some of the concerns that the reviewers have raised i think that the updated paper has improved in quality on the other hand i think that some of the weaknesses still remain overall i believe that the paper is close to the acceptance threshold seeing the improvements to the paper i am willing to raise my score to marginally above the acceptance threshold an interesting combination of existing ideas evaluated on three computer vision tasks overall novelty seems limited the paper does not explain its approach very well and the gained understanding is limited empirical results are interesting but not superconvincing docsepthis paper proposes to use the shift operation wu et al cvpr 2018 in an axial manner for mlpmixer architectures the proposed method performs much better than previous mlpbased methods on imagenet1k and on par with swintransformer strengths 1 the paper is well written and presented clearly 2 the proposed axial shift module is simple and elegant 3 experiments are done extensively beyond imagenet weaknesses 1 limited technical novelty given the fact that the shift operation cvpr 2018 was already presented as an alternative though similar operation to convolution the shiftresnet is already spatial convolution free except the first stem layer now it is not surprising that using a convolution alternative in mlpmixer can boost its performance to convolution level ie better than previous mlpmixers 2 the modification of using axial on top of shift lacks justification or ablation what if the original shift operation is applied to the nearby 8 points in the 3x3 window the channels can be kept the same by replacing horizontal shift with 4 points and vertical shift with another 4 points this should be very similar to the 5x5 configure used in the paper 3 another intuitive baseline is simply to use depth wise convolution with a depth multiplier of 2 instead of the 2 parallel axial shift module would this simple convolutional baseline affect performanceparamsflopsthroughput by a large margin possibly not if this simple convolutional baseline works then what is the benefit of using asmlpmixers 4 efficientnetv2 icml 2021 a representative convolutionbased method is not citeddiscussedcompared for example table 11 of efficientnetv2 presented v2s with 88g flops 901 imagess 836 top1 this is better than asmlpb 152g flops 4552 imagess although the throughput comparison is not 100 fair efficientnetn2 uses a larger batch size fp16 inference and se module i think asmlpb can hardly perform better than efficientnetv2 in a fair throughput comparison i think this paper is around the acceptance threshold mainly due to the limited technical novelty besides applying shift an alternative to convolution to mlpmixers and achieving on par performance as previous transformerconvolution methods i will raise my score if the weaknesses are addressed
### Summary: | the paper proposes a mlpbased architecture that makes extensive use of the shift operation on the feature maps the model performs well on several vision tasks and datasets the reviews are mixed even after the authors response main pros are that the proposed architecture is elegant and reasonable and the experimental evaluation is thorough and strong the main con is that the novelty is somewhat limited to some prior papers overall i recommend acceptance the reviewers point out that the architecture is good and the results are strong similarities to prior works do not seem serious enough to warrant rejection even an author of arguably the most related concurrent works s2mlp and s2mlpv2 confirms that there is sufficient difference moreover this is one of the first papers to show very strong results on detection and segmentation | [
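The axial shift operation that the reviews above describe (channels split into groups, each group displaced by a different offset along the vertical or horizontal axis, followed by channel MLPs) can be made concrete with a short sketch. This is an illustrative approximation, not the authors' implementation: `np.roll` wraps around instead of zero-padding, and the group and offset choices are assumptions.

```python
import numpy as np

def axial_shift(x, shift_size=3, axis=0):
    """Shift equal channel groups of x (shape H, W, C) by different offsets
    along one spatial axis. With shift_size=3 the offsets are -1, 0, +1."""
    offsets = list(range(-(shift_size // 2), shift_size // 2 + 1))
    groups = np.array_split(x, len(offsets), axis=2)      # split channels
    shifted = [np.roll(g, off, axis=axis) for g, off in zip(groups, offsets)]
    return np.concatenate(shifted, axis=2)                 # restore C channels

x = np.random.rand(8, 8, 6)
h_branch = axial_shift(x, shift_size=3, axis=0)   # vertical shift branch
w_branch = axial_shift(x, shift_size=3, axis=1)   # horizontal shift branch
# in the architecture discussed, each branch would pass through channel MLPs
# and the two branches would then be summed to mix local spatial context
```

Because the spatial mixing comes only from these fixed per-group displacements, the subsequent channel MLPs can reach neighbouring positions without any attention or spatial convolution, which is the point of contention in the reviews about how different this really is from shift-convolution or depthwise-convolution baselines.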
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper points out an interesting and important issue of gnns ie local aggregation is harmful for some disassortative graphs it further proposes nonlocal gnns by first sorting the nodes followed by aggregation the paper is well written and easy to follow

positives
1 the paper studies an important problem the proposed nonlocal gnns by first sorting the nodes followed by aggregation is interesting and makes sense
2 the paper is well written and easy to follow
3 experiments well support the claim of the paper the results demonstrated the effectiveness of the proposed method for disassortative graphs for node classification in addition the authors show the running time to demonstrate its efficiency and analyze the sorted nodes to demonstrate that the proposed method can learn nonlocal graphs

negative
1 it seems that for some disassortative graphs such as actor cornell texas and wisconsin using the node attributes to build the nonlocal graph is much more effective than using the attributed graph the authors may also need to compare a baseline that simply uses mlp to learn node embeddings then constructs the graph by calculating pairwise node similarity followed by a gnn for node classification this can be treated as a variant of the proposed nlmlp to show that sorting the nodes is more efficient and more effective

this paper proposes a way of speeding up nonlocal aggregation on graph convolutional neural networks based on sorting the nodes into an ordering and performing a 1d convolution on this resulting ordering this algorithm has the advantage of being asymptotically faster than other nonlocal aggregation schemes and the paper demonstrates that empirically it can do at least as well as some of the other methods

strengths the proposed approach is simple quite general and rather different from other tools for graph neural nets that im aware of the experimental evaluation methodology is sound and comparisons with several previous works are made

weaknesses the approach is difficult to interpret its difficult to convince someone working on gcns why it would work on some of the data sets the gains observed are inconclusive the experiments also focused on small data sets its unclear how such gains extend to more general settings

i work mostly on graph algorithms and only know a little about neural networks so im evaluating this paper mostly as a practical graph algorithms paper the effectiveness of such global sorting schemes based on a single score is very surprising almost too surprising on the other hand my general impression is that graph algorithms is full of such surprises many by now classical algorithms are arrived at by analyzing strange phenomena that happen to work well so im quite willing to suspend disbelief about why something like this would work as thats a much more detailed process from the discussions it seems that there are quite a bit of concerns raised about the experimentation process on the other hand the responses and presentations in the paper are also quite convincing to me so i believe this result is ready to appear in the conference if anything for the further discussion and interest it will generate and would still like to recommend acceptance of this paper

summary the goal of the paper is to perform node classification for graphs the authors propose a strategy to augment message passing graph neural networks with information from nonlocal nodes in the graph with a focus on disassortative graphs disassortative graphs are graph datasets where nodes with identical node labels are distant from each other in terms of edge connectivity with node representations learnt from standard graph neural networks etc the authors propose to use an attention guided sorting mechanism to create a proxy graph where nodes which may have identical node labels are connected to each other analogous to creating a knearest neighbor graph message passing is then employed on the proxy graph to learn final representations for the nodes since the authors employ a single vector namely c which they call a calibration vector to capture the importance of information shared across different nodes there is a speedup in comparison to strategies which employ a pairwise comparison between all nodes in the graph

pros
1 the idea to create a proxy graph to capture nonlocal information is interesting
2 the proposed technique can be augmented with almost any existing gnn

my concerns
1 disassortative or iid the authors in figure 1 show that the homophily of the created proxy graph is a value larger than that of the original graph however from table a2 in the appendix it is clear to see that mlps outperform gnns with or without the attention sorting in the disassortative graphs and the performance of the mlps and the proposed augmented nlmlp are well within one standard deviation of each other this questions the need to employ a proxy graph construction on top of mlps for these datasets as it appears like the data can be treated as iid and not relational moreover these datasets used from pei et al 2020 are extremely small to draw any significant conclusion also almost no gains are seen on the assortative datasets citeseer cora pubmed please add datasets from ogb and their running times when augmented with a proxy graph are the gains worthy of increased run times
2 baselines since the authors propose a strategy to construct a proxy graph and the number of neighbors of each node in the proxy graph is the same baselines such as creating graphs where nodes with identical labels are connected to each other or a gnn on a simple knearest neighbor graph created using the initial features appear crucial while a simple knn might appear more expensive the computation here is a single time effort also add a baseline where the adjacency structure of the graph is iteratively updated during training such as learning discrete structures for graph neural networks franceschi et al icml 2019
3 sufficiency lack of details the use of a single calibration vector may not be sufficient to sort the nodes there are no guarantees in the paper to say when a single calibration vector would suffice also the number of classes of nodes in each of the datasets used here is very small and the approach does not trivially extend to multilabel classification of nodes also how do you determine the number of neighbors in a proxy graph and do all nodes need to have the same number of neighbors in the proxy graph there are missing equations about how the calibration vector c is learnt what the objective is etc and the effect on running time when there are a large number of neighbors considered in the proxy graph without any equations its hard to argue against the case that the number of gradients to be computed would explode when the number of neighbors is increased in the proxy graph especially when jointly learning the gnn and the proposed augmentation

other minor concerns if possible please include the difference between assortative and nonassortative graphs in the introduction it makes it easier for a reader if details are added and the concerns are addressed i will be happy to improve my score

this paper targets addressing the node embedding problem in disassortative graphs a nonlocal aggregation framework is proposed since local aggregation may be harmful for some disassortative graphs to address the high computational cost in the recent geomgcn model that has an attentionlike step to compute the euclidean distance between every pair of nodes an idea of attentionguided sorting is introduced it learns an ordering of nodes such that distant but informative nodes are put near each other the sorting order depends on the attention scores computed with the local embedding vector of a node then a conv function is applied on the sorted sequence of local node embeddings to obtain the nonlocal embedding the final node embedding is then the concatenation of the local and nonlocal embeddings which is used for node classification

the presented simple approach is an interesting idea to push the distant but informative nodes together however it is unclear how the attentionguided sorting is aware of the distant nodes the local node embedding vectors z can be obtained either from the node content or from a gnn if z is from the node content only the attention score a is calculated without considering how nodes are close or distant on the graph and the whole approach works purely for node content classification if z is from a gnn nodes close on the graph have similar z embedding vectors and thus will be sorted next to each other so the sorting doesnt take distant nodes close

although the experimental results show the proposed approach performs better than several baselines more and stronger gnn models are expected to be compared with eg gins especially on the chameleon and squirrel datasets these two disassortative graphs can be handled by gnn kinds of models the node classification in the other four disassortative graphs in fact can be treated as a standard classification task by ignoring the graph structures as mlp on node features is already good

thanks for the clarifications from the authors the discussion was very helpful
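to make the sorting-then-1d-convolution step described in the reviews above concrete, here is a minimal sketch (numpy; the names calib and w, the dot-product scoring and the depthwise aggregation are assumptions for illustration rather than the paper's exact formulation):

```python
import numpy as np

def nonlocal_embedding(z, calib, w):
    # z: (n, d) local node embeddings (from node attributes or a gnn)
    # calib: (d,) calibration vector, w: (k,) 1d aggregation weights
    # each node gets one scalar attention score, nodes are sorted by it,
    # a 1d convolution runs over the sorted sequence, and the result is
    # concatenated with the local embedding. cost is o(n log n + n k d)
    # rather than the o(n^2 d) of all-pairs attention. sketch only.
    n, d = z.shape
    scores = z @ calib                  # one scalar score per node
    order = np.argsort(scores)          # attention-guided ordering
    zs = z[order]
    k = len(w)
    pad = k // 2
    zp = np.pad(zs, ((pad, pad), (0, 0)))
    agg = np.zeros((n, d))
    for j in range(k):                  # depthwise 1d convolution over the
        agg += w[j] * zp[j:j + n, :]    # sorted sequence of embeddings
    out = np.zeros((n, d))
    out[order] = agg                    # map back to the original node order
    return np.concatenate([z, out], axis=1)
```

whether the score z @ calib is computed from raw node attributes or from gnn outputs changes which nodes end up adjacent after sorting, which is precisely the ambiguity about awareness of distant nodes that the last review raises.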
### Summary: this paper is right at the borderline the reviewers agree it is well written proposing a simple but interesting idea however there was a feeling among the reviewers especially reviewer 1 that the paper could be strengthened considerably with a better discussion or some theory on the sufficiency of the calibration vectors as well as experiments on larger datasets doing one of these would have substantially strengthened the paper due to the remaining shortcomings the recommendation is not to accept the paper in its present state
1411,
253,
1083,
326,
253,
1180,
273,
27935,
281,
320,
10302,
651,
34667,
672,
253,
1180,
273,
15833,
403,
2559,
275,
253,
17335,
4216,
3340,
672,
26277,
4715,
253,
305,
9866,
285,
253,
4081,
42072,
50275,
977,
5884,
7350,
604,
1896,
4496,
2486,
253,
3064,
875,
44417,
800,
285,
1327,
515,
430,
800,
14580,
275,
253,
10199,
50276,
262,
2789,
352,
6927,
323,
247,
9414,
50276,
338,
4278,
403,
2879,
285,
253,
7350,
403,
9713,
891,
588,
320,
5211,
281,
3157,
619,
11691,
406,
33032,
2520,
2929,
8571,
327,
15974,
253,
4666,
21496,
1895,
275,
557,
515,
430,
800,
14580,
247,
1327,
6790,
20828,
7792,
310,
4081,
1580,
1980,
20828,
778,
320,
19632,
323,
690,
557,
515,
430,
800,
14580,
281,
2953,
253,
1029,
15180,
2105,
275,
253,
3332,
49040,
72,
14340,
1566,
326,
556,
271,
4116,
3022,
3213,
281,
11897,
253,
299,
26365,
4181,
875,
1046,
4667,
273,
7632,
271,
2934,
273,
4116,
26960,
23762,
310,
5611,
352,
33772,
271,
15824,
273,
7632,
824,
326,
13392,
533,
27096,
7632,
403,
1691,
2822,
1016,
643,
253,
23762,
1340,
7024,
327,
253,
4116,
7363,
10302,
342,
253,
1980,
21496,
4972,
273,
247,
4666,
840,
9383,
79,
1159,
310,
3732,
327,
253,
20045,
3425,
273,
1980,
4666,
46234,
281,
4044,
253,
1327,
6790,
21496,
253,
2457,
4666,
21496,
310,
840,
253,
32147,
318,
273,
253,
1980,
285,
1327,
6790,
21496,
534,
310,
908,
323,
4666,
9162,
50275,
783,
3559,
2969,
2746,
310,
271,
4722,
2934,
281,
7450,
253,
13392,
533,
27096,
7632,
2366,
2299,
352,
310,
12744,
849,
253,
4116,
26960,
23762,
310,
6600,
273,
253,
13392,
7632,
253,
1980,
4666,
21496,
11390,
1182,
476,
320,
2797,
2057,
407,
253,
4666,
2600,
390,
407,
305,
9866,
604,
1182,
310,
432,
253,
4666,
2600,
760,
253,
4116,
4868,
247,
310,
5118,
1293,
8180,
849,
7632,
403,
2810,
390,
13392,
327,
253,
4216,
253,
2644,
2746,
2987,
15846,
323,
4666,
2600,
9162,
604,
1182,
310,
432,
305,
9866,
50276,
26451,
2810,
327,
253,
4216,
452,
2074,
1182,
21496,
11390,
285,
3021,
588,
320,
20045,
1735,
281,
1016,
643,
840,
253,
23762,
36908,
1379,
13392,
7632,
2810,
50275,
20261,
253,
5661,
1543,
921,
253,
4081,
2746,
17923,
1805,
685,
2067,
1666,
25379,
625,
285,
10046,
305,
9866,
3210,
403,
3264,
281,
320,
2429,
342,
24088,
305,
968,
3340,
327,
448,
482,
282,
251,
285,
37233,
1661,
15302,
253,
6628,
767,
557,
515,
430,
800,
14580,
476,
320,
15726,
407,
305,
9866,
9351,
273,
3210,
253,
4666,
9162,
275,
643,
1740,
557,
515,
430,
800,
14580,
275,
958,
476,
320,
4127,
347,
247,
2629,
966,
9162,
4836,
407,
23111,
253,
4216,
5289,
347,
13361,
81,
327,
4666,
3386,
310,
2168,
1175,
50275,
35501,
323,
253,
8254,
6787,
432,
253,
4477,
253,
5955,
369,
1077,
9371,
2490,
187,
4118,
18435,
27,
2520,
2929,
310,
987,
387,
253,
45210,
253,
30628,
5194,
352,
310,
973,
3542,
36636,
247,
2969,
533,
4722,
2934,
2299,
627,
369,
247,
5471,
2190,
253,
30628,
3340,
37317,
337,
326,
253,
2929,
812,
320,
34615,
15455,
342,
247,
1805,
11985,
485,
3762,
327,
253,
32572,
273,
253,
18543,
11390,
347,
973,
347,
4679,
327,
4067,
15302,
2509,
581,
273,
841,
651,
452,
9619,
34615,
253,
2929,
1955,
281,
253,
5780,
35387,
253,
17401,
310,
417,
281,
2997,
253,
2929,
275,
697,
1246,
1375
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary in the present paper the author intends to get further insights into the physics behind earthquake ruptures using a bnn to model simulated data from the literature by using a bnn the parameters of the model are not deterministic scalar values but complete probability distributions studying the change of the distributions in the parameters before and after training the author tries to extract information about the relative importance of the input variables and also comprehend the physical mechanisms behind earthquake ruptures results are shown in figure 3 on which the change of behavior of the distributions of the parameters can be observed as well as in figure 4 where the mean and standard deviations for all the parameters are presented the pattern in figure 6 seems to indicate that variables previously thought to be important in the task of predicting the presence of the rupture such as normal stress and friction are also pointed out as being important in this case finally the authors also claim an improvement in the f1 metric in comparison to previous nn methods pros the idea of the paper seems well directed ie gaining insight on complex physical procedures using an approach that results in the combination of nns and a bayesian approach using a bayesian approach is a good way of dealing with small datasets and also allows to account for the uncertainty of all the latent parameters while also providing more robust and sensible predictions when new data is presented the approach seems to provide results consistent with the literature findings regarding the important variables in the prediction the final performance of the algorithm seems to improve on the previous stateoftheart methods by taking advantage of the properties that the bayesian approach offers cons the key concern with this paper is that nns as well as bnns are notoriously blackbox algorithms with no easy way of interpreting the inner parameters in most cases taking this into consideration i would suggest the author to motivate in a stronger manner why the usage of bnns is desirable for the proposed problem and why not use other already established bayesian approaches to assess the importance of the input variables taking into account the previous point i consider there is a general lack of rigorous experiments that could in principle suggest a clear advantage of using bnns instead of any other approaches no systematic comparisons with previous methods are present such as for example with the random forest feature importance algorithm which is mentioned a couple of times if the main goal is to gain insights on the main variables involved in the presence of an earthquake rupture i would expect a more detailed analysis comparing how good these insights provided by bnns are and how do they stand in comparison with the established literature other basic techniques for assessing the importance of the variables in the prediction tasks are not mentioned although it would be nice to use them as a baseline to compare against examples such as pca loo crossvalidation and others could be used here the claim of an improvement of 234 wrt nns is not strongly addressed since the nn experiments are not included here or at least there is no mention of the setup of these nns as before there is no systematic comparison between the bnns trained and the nns that are used as baselines there is a lengthy discussion on how to obtain the elbo for vi however in the end there is no final expression for the loss function which is going to be 
employed i would appreciate in section 4 an explicit description of the objective of the system since theres no mention of the final binary classification problem anywhere the prediction uncertainties lack a systematic evaluation as well since all that is provided is presented in figure 5 how well do the predictions provided stand against other methods for obtaining final predictive distributions vi is a method whose performance and final predictions are constrained due to its formulation is there any reason why using vi instead of any other approach to bnns in case that we wanted to study the final predictive distributions why not use hmc or other more flexible approaches than vi at the end of the section 51 first paragraph it is claimed that positive and high magnitude weights contribute to the earthquake rupture and vice versa this sentence seems a bit confusing since it seems to imply a causal relation between the high magnitude of the weights and the appearance of ruptures this i think is the other way around very clear rupture conditions imply positive high magnitude weights which in turn return a higher predicted probability of rupture minor comments even though the paper tackles physical phenomena such as earthquake rupture it does not provide any description of such a process or the variables involved concepts such as nucleation and fault barrier should be at least briefly introduced as well as the slip weakening law or the critical slip distance a short description of these terms and their relevance to the problem would help to interpret the final results obtained also explicit expressions for the rupture physics would help a lot in section 2 to understand the different roles of the variables and their relations throughout the whole text the first person is used while writing in case there is only one author this can be okay i only point it out since it seems to be an uncommon choice there are a lot of typos all through the paper please perform a careful reading and correct them the description on figure 4 is confusing it does not seem to correspond to the presented images either that or the text is unclear when selecting the important parts of the figures for the nodes mentioned 4th paragraph of introduction not all ml algorithms are black boxes nns are but others such as linear regression decision trees etc can be very interpretable 5th paragraph of introduction exciting avoid usage of this type of subjective adjective all through the paper 5th paragraph of introduction bnns may work better with fewer data but we have to pay close attention to the prior formulation to not introduce unreasonable biases docsep this paper extends on previous works by ahamed daub 2019 from a twolayer mlp to a bayesian nn version of it for predicting whether a piece of material will rupture under some conditions although this increase in model complexity improves performance a little bit it does not represent a major advance the main point of the paper is to show that bayesian nns allow one to get not only a prediction but also naturally provide uncertainties on these predictions although the paper clearly explains the basics of bnns it does not provide any new insight into them the application to rupture physics is interesting but does not seem groundbreaking for these reasons i lean toward rejecting the paper also given the github repo referred to there is a breach of anonymity on the paper itself i have a couple of remarks it is unclear as to where the data comes from a simulation is mentioned but not how
it works i see in ahamed daub 2019 that it is a finite element simulation still about the simulation it is unclear whether the stress state is heterogeneous or not it seems it ought to be however in that case the description of the stress state would consist in a full field of values and not just a couple of numerical values the discussion of the uncertainty in the various input variables is tedious and does not really highlight how bnns help to get transparent interpretations fig 5b there is a clear sqrtxsqrt1x shape of this curve do you have an explanation for that and can you check the fit the conclusion repeats some parts that were stated earlier it should instead focus on how bnns help understand the physics the physics of rupture itself is not very interesting to a ml audience docsepthe manuscript does not provide enough details about the physics about which the bnn provides insight the manuscript employs well developed machine learning algorithms in an application in geoscience and does not provide a novel learning algorithm or contribute to machine learning topics this paper does not meet the iclr standards and therefore i cannot recommend it for publication docsepthis paper proposes a bayesian neural network for predicting if an earthquake will break a fault or not overcoming the small data problem and predicting model uncertainty the data is composed of 8 features and a binary output and the samples are all coming from simulations an analysis on the means and standard deviations of the first and last layer of the neural networks weights has been carried out the problem is interesting and the method is useful as the study on the weights means and stds is interesting yet i think the paper is not well polished many incoherences in the text and the figures and i dont really understand why a synthetic dataset coming from physical model equations needs to go into this complex uncertainty quantification we are making here a metamodel of something that is fully described by equations so why using machine learning i also dont understand why the author talks about a small data problem as here we could simply increase the number of simulated samples yet i can see the interest of such a technique if the goal was to later try to apply it to real data where some physics might be unknown or too complex i also dont know if the findings of the paper are of interest for the iclr community but as i come from an interdisciplinary field too i know how hard it can be to find an appropriate and yet good place to publish other questionsremarks equ 8 i dont see how we can go from eq 6 to eq 8 this part is not clear please cite the reference papers for the bayesian nn as well as for elbo there is clearly a part missing in the stateoftheart regarding this while the equations such as the long and useless 5 one are just explaining what is already known in this literature figure 4 shear stress connected to node4 of the hidden layer has the highest uncertainty similarly the weights associated with the input parameters and node5 have high uncertainty whereas the weights associated with the input parameters and the nodes 711 have relatively low uncertainty d uncertainty of the weights associated with the hidden layer nodes and the output node weights in node 7 and 8 have high uncertainty while the rest of the weights have relatively low uncertainty please update the text or the figures so the numbers match same thing for the text for now i cannot understand anything figure 1 taus decreases linearly not taus is fixed maybe
you meant tau or the shear stress typos perhaps widely studied and applied in many situations perhaps never seen this word in a paper ar new distribution is therefore derived by which is approximate 7 less data is limited data
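as a concrete reference point for the elbo discussion in the reviews above, here is a minimal sketch of a mean-field variational bayesian layer and the negative elbo loss it induces; the layer sizes, prior scale, single-sample estimate and toy data are illustrative assumptions and are not taken from the reviewed paper

```python
# minimal sketch of a mean-field variational bayesian layer and its negative elbo (pytorch);
# sizes, prior scale and the toy data below are made-up placeholders, bias terms are omitted
import torch
import torch.nn.functional as F
from torch.distributions import Normal, kl_divergence

class BayesLinear(torch.nn.Module):
    def __init__(self, n_in, n_out, prior_std=1.0):
        super().__init__()
        self.w_mu = torch.nn.Parameter(torch.zeros(n_out, n_in))
        self.w_rho = torch.nn.Parameter(torch.full((n_out, n_in), -3.0))  # softplus(rho) = std
        self.prior = Normal(0.0, prior_std)

    def forward(self, x):
        std = F.softplus(self.w_rho)
        w = self.w_mu + std * torch.randn_like(std)           # one reparameterized weight sample
        self.kl = kl_divergence(Normal(self.w_mu, std), self.prior).sum()
        return x @ w.t()

def neg_elbo(model, x, y, n_data):
    # negative elbo = expected negative log-likelihood + kl(q || prior) scaled by dataset size
    logits = model(x).squeeze(-1)
    nll = F.binary_cross_entropy_with_logits(logits, y, reduction="mean")
    kl = sum(m.kl for m in model.modules() if isinstance(m, BayesLinear))
    return nll + kl / n_data

# toy usage: 8 input features and a binary rupture label, mirroring the setup discussed above
model = torch.nn.Sequential(BayesLinear(8, 16), torch.nn.ReLU(), BayesLinear(16, 1))
x, y = torch.randn(32, 8), torch.randint(0, 2, (32,)).float()
loss = neg_elbo(model, x, y, n_data=2000)
loss.backward()
```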
### Summary: the paper considers an interesting application of bayesian neural nets to the geophysics domain however the paper does not make a novel contribution from the machine learning perspective and the improvements on top of the previously proposed approach by ahamed daub 2019 seem to be quite modest overall the paper does not seem to be ready for publication at iclr
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies the effective metrics for evaluating recent works on novel view synthesis from monocular videos the proposed effective multiview factors emf measures multiview signals in the evaluation of view synthesis it also comes with a new dataset which tries to mitigate the multiview signals in captures two new metrics are proposed to measure the quality of view synthesis and motion several stateoftheart methods are further improved but still struggle in the cases of motion and few multiview signal strengths emf the effective multiview factors measure the amount of multiviewness in the capture such multiviewness makes the task of novel view synthesis on dynamic scenes easier as the problem degrades into a multiview setup emf consists of scene motion and camera angular velocity and a higher value in either suggests there is a strong multiviewness in the capture emf basically tells how easydifficult the capture is to reconstruct such a metric is nice to have in the future dataset release masked psnr and correspondence two new metrics are proposed to further measure the quality of the synthesized image both are tailored for dynamic scenes but i see them also useful in static scenes as well masked psnr only calculates psnr in the valid region and pckt tells whether the predicted motion is correct the design choices are smart weaknesses correspondence i see masked psnr can be applied to any method as long as the ground truth pose and depth are available however the correspondence metric lacks generalizability as shown in the paper only methods that explicitly model motion can calculate the pckt score it does not apply to methods like tnerf camera angular velocity nerfies and hypernerf have a high camera angular velocity in tbl 1 as far as i see this is because it is switching frames between two cameras if the captures are reordered in a way that uses frames from camera 1 first and camera 2 second or only frames from camera 1 will nerfies still work if so it will have a much smaller velocity thus less emf it would be interesting to see how the method performs vs emf no view direction or appearance encoding it is mentioned in l297 both are turned off during training but the goal of psnrm is to evaluate the quality of nvs in the seen region in practice view direction and appearance encoding are essential in synthesizing new views turning them off will hurt the psnrm score a lot is there a specific reason for doing so aside from overfitting typo l32 agnitude magnitude limitations are not discussed the author can discuss the use case of emf and the two new metrics more docsepthis paper studied the problem of dynamic 3d scene synthesis from monocular video sequences and found flaws of overrepresentation of slowmoving objects with a fastmoving camera in existing datasets the authors then proposed a new metric called effective multiview factors emf to quantify the amount of multiview signal in the image sequence the authors also introduced a new dataset with very low emf and argued that the new dataset should be more suitable for evaluation of dynamic 3d scene synthesis methods finally the authors evaluated four representative algorithms on the new dataset and find a performance gap not noticed with previous existing datasets strengths the authors delve into the characteristics of existing datasets and managed to produce an innovative metric to evaluate the difficulty of a dataset in terms of monocular dynamics weaknesses while the ability to build 3d representations from monocular dynamics is desirable real
world video sequences could also contain a good proportion of high emf data so the argument that low emf datasets are better for evaluation may not hold unconditionally the difficulty of a dataset may also come from other aspects such as shape complexity and surface properties of objects the authors answered yes for the question did you describe the limitations of your work but did not state which section contains it docsepthe paper does a complete review of existing approaches recovering dynamic 3d scenes from monocular videos especially nerfies hypernerf and nsff it studies the camera trajectory proposes a metric effective multiview factors emf to quantify multiview cues in a dynamic scene with moving cameras existing datasets typically have high multiview cues therefore a new dataset captured by iphone is introduced with little multiview cue additional metrics such as masked psnr and pckt are also introduced to ensure the fairness of evaluation the new benchmark brings additional challenges for existing approaches of dynamic 3d capture strengths quantifying multiview cues using emf is neat different datasets in dynamic 3d capture propose different data including different camera trajectories and emf quantifies the difficulties of all the data i also like pckt correspondence besides normal psnr right now a lot of novel view synthesis approaches focus on psnr but psnr does not directly reflect how well the model understands the 3d world pckt is more explicit and i think reporting both masked psnr and pckt is a good idea supp webpage is fantastic thank you weaknesses i dont see any weaknesses although i think the paper is more like a new benchmark it might be more suitable for the benchmark track additional comments no need to address page 8 footnote we find that this code base performs better than the original code release missing a period limitations are not discussed in the paper docsepthis paper proposes a dataset of monocular videos to evaluate nonrigid novel view synthesis methods a factor for measuring the multiview effect is proposed and two evaluation metrics are proposed strengths the paper is wellorganized and it is easy to read the datasets with evaluation metrics are proposed several methods of nonrigid novel view synthesis are evaluated on the proposed datasets weakness 1 using a multicamera setup is more reliable than a singlecamera setup for performance analysis after all the goal is to evaluate methods instead of training models i dont think that it is the drawback of existing methods we do not always need to capture new data for nonrigid novel view synthesis evaluation so it is difficult for me to understand the strengths of the proposed solution 2 the idea of maskedpsnr is not surprising first if our goal is to evaluate the regions that are observed it would be straightforward to mask other regions out second we may also hope that the methods can predict and synthesize the regions that are not observed in training data in this scenario the mask should not be used overall i dont think that the masking is novel 3 although nerfies trained models on images sampled from two cameras the method is able to train models using a singlecamera setup 4 the factor of the multiview effect is just a simple trick and there exist many other variants if we want it is correct but i dont think that it is sufficiently significant in this problem post rebuttal after discussion i agree with reviewer vdbh and i would strongly suggest the authors use the comments by vdbh in the final submission to
position the work yes
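to make the two evaluation ideas discussed above concrete, here is a small numpy sketch of a masked psnr and a pck-style correspondence score; the mask convention, array shapes and the 5 percent distance threshold are illustrative assumptions rather than the benchmark's exact protocol

```python
# illustrative numpy sketch of masked psnr over co-visible pixels and a pck-style
# correspondence accuracy; shapes, mask convention and threshold are assumptions
import numpy as np

def masked_psnr(pred, gt, mask, max_val=1.0):
    """psnr computed only over pixels marked valid (co-visible) by the mask."""
    valid = mask.astype(bool)
    mse = np.mean((pred[valid] - gt[valid]) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)

def pck_t(pred_kp, gt_kp, img_diag, ratio=0.05):
    """fraction of transferred correspondences within ratio * image diagonal of ground truth."""
    dist = np.linalg.norm(pred_kp - gt_kp, axis=-1)
    return float(np.mean(dist < ratio * img_diag))

# toy usage
h, w = 480, 640
pred, gt = np.random.rand(h, w, 3), np.random.rand(h, w, 3)
mask = np.ones((h, w), dtype=bool)                       # 1 = pixel seen during training
print(masked_psnr(pred, gt, mask))
print(pck_t(np.random.rand(50, 2) * [w, h], np.random.rand(50, 2) * [w, h],
            img_diag=float(np.hypot(h, w))))
```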
### Summary: prerebuttal this paper had mixed reviews postrebuttal the paper had two strong supporters a6gt and vdbh who argued that the paper provides valuable insights into an important field as well as a supporter dlu6 who commented in the discussion below that they are in favor of the paper although did not update their review the only remaining criticism comes from 2bcv the ac does not find 2bcvs review persuasive a6gts comments summarize the acs perspective well and 2bcv did not participate in discussion the ac is inclined to accept the paper and encourages the authors to use their extra page to integrate their responses to the reviewers
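as a closing note on the emf quantity that the reviews above keep returning to, the camera angular velocity ingredient can be pictured with a few lines of numpy; the simple mean over frame pairs and the degree-per-second units are illustrative assumptions

```python
# rough numpy sketch of the camera angular velocity ingredient of emf;
# averaging scheme and units (degrees per second) are illustrative assumptions
import numpy as np

def camera_angular_velocity(rotations, fps):
    """mean rotation angle between consecutive camera-to-world rotations, in deg/s."""
    speeds = []
    for r0, r1 in zip(rotations[:-1], rotations[1:]):
        r_rel = r0.T @ r1
        cos_angle = np.clip((np.trace(r_rel) - 1.0) / 2.0, -1.0, 1.0)
        speeds.append(np.degrees(np.arccos(cos_angle)) * fps)
    return float(np.mean(speeds))

# toy usage: a camera spinning 1 degree per frame at 30 fps gives roughly 30 deg/s
def rot_z(deg):
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

poses = [rot_z(i) for i in range(10)]
print(camera_angular_velocity(poses, fps=30))
```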
15302,
275,
7870,
495,
69,
9232,
29328,
1027,
941,
1690,
1027,
6568,
24102,
285,
802,
71,
572,
255,
7790,
253,
12748,
273,
512,
253,
941,
50276,
74,
671,
751,
268,
777,
85,
17668,
16280,
2622,
3714,
23838,
987,
1024,
247,
2257,
273,
4460,
1859,
9066,
7274,
2770,
327,
3714,
23838,
533,
3714,
23838,
1057,
417,
3587,
4887,
849,
973,
253,
1566,
24586,
253,
495,
69,
1533,
268,
777,
85,
310,
625,
6843,
285,
891,
1158,
9610,
1097,
34741,
3714,
23838,
285,
268,
777,
85,
310,
247,
1175,
2934,
50276,
4032,
42498,
310,
15143,
5717,
368,
50276,
20881,
1255,
265,
50276,
74,
13414,
923,
667,
32213,
3738,
891,
1158,
253,
2929,
310,
625,
751,
247,
747,
22791,
50276,
262,
1537,
320,
625,
7470,
323,
253,
22791,
3540,
50276,
38092,
5701,
642,
878,
281,
2953,
50275,
6377,
854,
43302,
359,
1089,
326,
436,
2127,
2613,
17923,
1805,
685,
253,
3236,
2127,
3727,
50276,
33722,
247,
2180,
7364,
403,
417,
5469,
275,
253,
2929,
5474,
33032,
2520,
2929,
29328,
247,
10895,
273,
1114,
26292,
10556,
281,
7472,
1327,
10389,
301,
4460,
1859,
9066,
3082,
247,
2803,
323,
10499,
253,
1554,
400,
827,
1055,
310,
4081,
285,
767,
7103,
17082,
403,
4081,
50276,
296,
3755,
20556,
50276,
783,
2929,
310,
973,
34092,
285,
352,
310,
3477,
281,
1239,
50276,
783,
15302,
342,
7103,
17082,
403,
4081,
50275,
43249,
3082,
273,
1327,
10389,
301,
4460,
1859,
9066,
403,
6760,
327,
253,
4081,
15302,
50273,
20881,
1255,
50276,
18,
970,
247,
23559,
312,
3525,
9978,
310,
625,
9630,
685,
2014,
32499,
9978,
323,
3045,
1783,
846,
512,
253,
4736,
310,
281,
7472,
3082,
3185,
273,
3733,
3210,
891,
13414,
1158,
326,
352,
310,
253,
32489,
273,
5368,
3082,
359,
513,
417,
1900,
878,
281,
9232,
747,
941,
323,
1327,
10389,
301,
4460,
1859,
9066,
7103,
594,
352,
310,
2834,
323,
479,
281,
2096,
253,
20544,
273,
253,
4081,
2900,
50276,
19,
253,
2934,
273,
34741,
793,
23838,
310,
417,
10084,
806,
604,
776,
4736,
310,
281,
7472,
253,
4811,
326,
403,
2540,
352,
651,
320,
15246,
281,
8989,
643,
4811,
562,
1273,
359,
778,
671,
3524,
326,
253,
3082,
476,
3283,
285,
46919,
253,
4811,
326,
403,
417,
2540,
275,
3733,
941,
275,
436,
10076,
253,
8989,
943,
417,
320,
908,
4583,
891,
13414,
1158,
326,
253,
44790,
310,
4460,
50276,
20,
3738,
38998,
71,
447,
10166,
3210,
327,
3888,
19958,
432,
767,
14693,
253,
1332,
310,
2104,
281,
6194,
3210,
970,
247,
2014,
32499,
9978,
50275,
21,
253,
2803,
273,
253,
1554,
400,
827,
1055,
310,
816,
247,
2969,
10480,
285,
627,
2226,
1142,
643,
11640,
604,
359,
971,
352,
310,
3451,
533,
891,
13414,
1158,
326,
352,
310,
10481,
1534,
275,
436,
1895,
50275,
5996,
30080,
22559,
50276,
6438,
5955,
891,
5194,
342,
37317,
362,
5470,
73,
285,
891,
651,
7052,
1804,
253,
4477,
897,
253,
5701,
407,
362,
5470,
73,
275,
253,
2457,
19529,
281,
1899,
253,
789,
50276,
9820,
2490,
187,
4118,
18435,
27,
3456,
250,
2858,
22559,
436,
2929,
574,
6804,
10123,
1501,
250,
2858,
22559,
253,
2929,
574,
767,
2266,
14501,
247,
23,
7332,
285,
362,
5470,
73,
665,
9125,
326,
253,
2929,
3400,
9865,
16039,
715,
271,
1774,
1673,
347,
973,
347,
247,
31409,
277,
7675,
23,
665,
20503,
275,
253,
5955,
2708,
326,
597,
403,
275,
3718,
273,
253,
2929,
3738,
858,
417,
5731,
616,
2278,
253,
760,
5780,
14226,
3249,
432,
374,
12847,
87,
253,
913,
1057,
417,
1089,
374,
12847,
10936,
2278,
34593,
247,
23,
72,
1641,
5701,
26799,
253,
913,
84,
8668,
973,
285,
374,
12847,
87,
858,
417,
10078,
275,
5955,
253,
913,
310,
21802,
281,
2997,
253,
2929,
285,
29426,
253,
4477,
281,
897,
616,
4465,
3239,
281,
19837,
616,
6128,
281,
253,
30628
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
2175,
253,
3576,
17082,
323,
16344,
3332,
2987,
327,
4460,
1859,
9066,
432,
1114,
26292,
10556,
253,
4081,
3576,
1554,
400,
827,
2616,
802,
71,
5593,
1554,
400,
827,
6298,
275,
253,
7103,
273,
1859,
9066,
352,
671,
3249,
342,
247,
747,
10895,
534,
14177,
281,
29966,
253,
1554,
400,
827,
6298,
275,
28174,
767,
747,
17082,
403,
4081,
281,
2557,
253,
3290,
273,
1859,
9066,
285,
3200,
2067,
1375,
23037,
14387,
3082,
403,
2007,
5520,
533,
1335,
11182,
275,
253,
2219,
273,
3200,
285,
1643,
1554,
400,
827,
2625,
50276,
296,
3755,
20556,
50276,
358,
71,
50276,
783,
3576,
1554,
400,
827,
2616,
2557,
253,
2408,
273,
1554,
400,
827,
1255,
275,
253,
9232,
824,
1554,
400,
827,
1255,
2789,
253,
4836,
273,
4460,
1859,
9066,
327,
7870,
13451,
6927,
347,
253,
1895,
372,
25013,
715,
247,
1554,
400,
827,
9978,
802,
71,
8414,
273,
6200,
3200,
285,
6568,
12336,
7602,
285,
2169,
1318,
275,
2057,
5936,
627,
310,
247,
2266,
1554,
400,
827,
1255,
275,
253,
9232,
802,
71,
10323,
8599,
849,
3477,
38157,
253,
9232,
310,
281,
17029,
824,
17082,
310,
5322,
281,
452,
275,
253,
2852,
10895,
3727,
50275,
12477,
264,
3714,
23838,
285,
17668,
50276,
9389,
747,
17082,
403,
4081,
281,
2007,
2557,
253,
3290,
273,
253,
17791,
2460,
1097,
403,
27846,
323,
7870,
13451,
533,
891,
923,
731,
671,
4217,
275,
4228,
13451,
347,
973,
34741,
3714,
23838,
760,
45319,
3714,
23838,
275,
253,
3588,
2919,
285,
268,
777,
85,
8599,
1880,
253,
8131,
3200,
310,
3451,
253,
2216,
10165,
403,
7060,
50275,
20881,
1255,
265,
50275,
5528,
2541,
566,
50276,
74,
923,
34741,
3714,
23838,
476,
320,
3732,
281,
667,
1332,
347,
1048,
347,
253,
3216,
5083,
16753,
285,
6864,
403,
2130,
2299,
253,
17668,
17082,
19756,
2087,
50228,
347,
2011,
275,
253,
2929,
760,
3082,
326,
11120,
3210,
3200,
476,
10173,
253,
268,
777,
85,
4868,
352,
1057,
417,
4647,
281,
3082,
751,
246,
1216,
71,
50275,
32499,
12336,
7602,
50276,
1216,
71,
447,
285,
4373,
1216,
71,
452,
247,
1029,
6568,
12336,
7602,
275,
47838,
337,
347,
2080,
347,
891,
923,
436,
310,
984,
352,
310,
12797,
13009,
875,
767,
14693,
604,
253,
28174,
403,
294,
16586,
275,
247,
1039,
326,
4648,
13009,
432,
6568,
337,
806,
285,
6568,
374,
1273,
390,
760,
13009,
432,
6568,
337,
588,
38998,
71,
447,
1335,
789,
604,
594,
352,
588,
452,
247,
1199,
4577,
7602,
3021,
1679,
802,
71,
352,
651,
320,
4722,
281,
923,
849,
253,
1332,
17923,
4632,
802,
71,
50275,
2369,
1859,
3884,
390,
7286,
9706,
50276,
262,
310,
5393,
275,
298,
23185,
1097,
403,
3531,
745,
1309,
3733,
533,
253,
4736,
273,
3714,
79,
1109,
310,
281,
7472,
253,
3290,
273,
295,
10936,
275,
2326,
2919,
275,
3946,
1859,
3884,
285,
7286,
9706,
403,
5667,
275,
35143,
3006,
747,
6849,
8577,
731,
745,
588,
8513,
253,
3714,
79,
1109,
4868,
247,
2257,
310,
627,
247,
2173,
1921,
2509,
594,
9255,
432,
689,
31893,
50275,
555,
5367,
298,
1237,
639,
79,
3396,
50276,
30362,
3396,
7364,
403,
417,
5469,
253,
2488,
476,
2319,
253,
897,
1083,
273,
802,
71,
285,
767,
747,
17082,
625,
5474,
33032,
2520,
2929,
5421,
253,
1895,
273,
7870,
495,
69,
6200,
9066,
432,
1114,
26292,
3492,
3425,
285,
1119,
32138,
273,
689,
37626,
273,
3468,
26621,
5113,
342,
247,
3809,
26621,
6568,
275,
5368,
15302,
253,
4477,
840,
4081,
247,
747,
7982,
1925,
3576,
1554,
400,
827,
14142,
802,
71,
281,
22048,
253,
2408,
273,
1554,
400,
827,
2625,
275,
253,
2460,
3425,
253,
4477,
671,
5611,
247,
747,
10895,
342,
1077,
1698,
802,
71,
285,
9125,
326,
253,
747,
10895,
943,
320,
625,
7470,
323,
7103,
273,
7870,
495,
69,
6200,
9066,
3082,
4720,
253,
4477,
6760,
1740,
8612,
11333,
327,
253,
747,
10895,
285,
1089,
3045,
8037,
417,
1146,
8344,
342,
2045,
5368,
15302,
20544,
50276,
783,
4477,
1448,
306,
715,
253,
8847,
273,
5368,
10895,
285,
7303,
281,
4711,
16694,
7982,
281,
7472,
253,
10183,
273,
10895,
275,
2426,
273,
1114,
26292,
8062,
50276,
20881,
1255,
265,
50276,
6050,
253,
3745,
273,
1973,
495,
69,
6779,
432,
1114,
26292,
8062,
310,
11408,
1524,
3159,
3492,
6430,
812,
671,
3831,
247,
973,
8394,
273,
1029,
802,
71,
941,
594,
253,
4154,
326,
1698,
802,
71,
15302,
310,
1805,
323,
7103,
778,
417,
2186,
440,
12380,
595,
253,
10183,
273,
10895,
778,
671,
1705,
432,
643,
7794,
824,
347,
5281,
10454,
285,
2553,
2867,
273,
5113,
253,
4477,
9577,
4754,
323,
253,
1953,
858,
368,
6266,
253,
7364,
273,
634,
789,
533,
417,
14851,
534,
2593,
4428,
352,
5474,
339,
431,
248,
2929,
1057,
247,
3426,
2278,
323,
5368,
7274,
27930,
7870,
495,
69,
13451,
432,
1114,
26292,
10556,
3340,
38998,
71,
447,
4373,
1216,
71,
285,
19769,
567,
352,
2175,
253,
6568,
18974,
29328,
247,
7982,
3576,
1554,
400,
827,
2616,
802,
71,
281,
22048,
1554,
400,
827,
26638,
275,
247,
7870,
6200,
342,
4886,
14693,
5368,
15302,
5431,
452,
1029,
1554,
400,
827,
26638,
3103,
247,
747,
10895,
10848,
407,
13997,
73,
531,
310,
5611,
342,
1652,
1554,
400,
827,
30129,
3081,
17082,
824,
347,
34741,
3714,
23838,
285,
268,
777,
85,
403,
671,
5611,
281,
5416,
253,
28959,
273,
7103,
253,
747,
22791,
10316,
3081,
7881,
323,
5368,
7274,
273,
7870,
495,
69,
9232,
20544,
50275,
17149,
5411,
1554,
400,
827,
26638,
970,
802,
71,
310,
18176,
1027,
15302,
275,
7870,
495,
69,
9232,
29328,
1027,
941,
1690,
1027,
6568,
24102,
285,
802,
71,
572,
255,
7790,
253,
12748,
273,
512,
253,
941,
50276,
74,
671,
751,
268,
777,
85,
17668,
16280,
2622,
3714,
23838,
987,
1024,
247,
2257,
273,
4460,
1859,
9066,
7274,
2770,
327,
3714,
23838,
533,
3714,
23838,
1057,
417,
3587,
4887,
849,
973,
253,
1566,
24586,
253,
495,
69,
1533,
268,
777,
85,
310,
625,
6843,
285,
891,
1158,
9610,
1097,
34741,
3714,
23838,
285,
268,
777,
85,
310,
247,
1175,
2934,
50276,
4032,
42498,
310,
15143,
5717,
368,
50276,
20881,
1255,
265,
50276,
74,
13414,
923,
667,
32213,
3738,
891,
1158,
253,
2929,
310,
625,
751,
247,
747,
22791,
50276,
262,
1537,
320,
625,
7470,
323,
253,
22791,
3540,
50276,
38092,
5701,
642,
878,
281,
2953,
50275,
6377,
854,
43302,
359,
1089,
326,
436,
2127,
2613,
17923,
1805,
685,
253,
3236,
2127,
3727,
50276,
33722,
247,
2180,
7364,
403,
417,
5469,
275,
253,
2929,
5474,
33032,
2520,
2929,
29328,
247,
10895,
273,
1114,
26292,
10556,
281,
7472,
1327,
10389,
301,
4460,
1859,
9066,
3082,
247,
2803,
323,
10499,
253,
1554,
400,
827,
1055,
310,
4081,
285,
767,
7103,
17082,
403,
4081,
50276,
296,
3755,
20556,
50276,
783,
2929,
310,
973,
34092,
285,
352,
310,
3477,
281,
1239,
50276,
783,
15302,
342,
7103,
17082,
403,
4081,
50275,
43249,
3082,
273,
1327,
10389,
301,
4460,
1859,
9066,
403,
6760,
327,
253,
4081,
15302,
50273,
20881,
1255,
50276,
18,
970,
247,
23559,
312,
3525,
9978,
310,
625,
9630,
685,
2014,
32499,
9978,
323,
3045,
1783,
846,
512,
253,
4736,
310,
281,
7472,
3082,
3185,
273,
3733,
3210,
891,
13414,
1158,
326,
352,
310,
253,
32489,
273,
5368,
3082,
359,
513,
417,
1900,
878,
281,
9232,
747,
941,
323,
1327,
10389,
301,
4460,
1859,
9066,
7103,
594,
352,
310,
2834,
323,
479,
281,
2096,
253,
20544,
273,
253,
4081,
2900,
50276,
19,
253,
2934,
273,
34741,
793,
23838,
310,
417,
10084,
806,
604,
776,
4736,
310,
281,
7472,
253,
4811,
326,
403,
2540,
352,
651,
320,
15246,
281,
8989,
643,
4811,
562,
1273,
359,
778,
671,
3524,
326,
253,
3082,
476,
3283,
285,
46919,
253,
4811,
326,
403,
417,
2540,
275,
3733,
941,
275,
436,
10076,
253,
8989,
943,
417,
320,
908,
4583,
891,
13414,
1158,
326,
253,
44790,
310,
4460,
50276,
20,
3738,
38998,
71,
447,
10166,
3210,
327,
3888,
19958,
432,
767,
14693,
253,
1332,
310,
2104,
281,
6194,
3210,
970,
247,
2014,
32499,
9978,
50275,
21,
253,
2803,
273,
253,
1554,
400,
827,
1055,
310,
816,
247,
2969,
10480,
285,
627,
2226,
1142,
643,
11640,
604,
359,
971,
352,
310,
3451,
533,
891,
13414,
1158,
326,
352,
310,
10481,
1534,
275,
436,
1895,
50275,
5996,
30080,
22559,
50276,
6438,
5955,
891,
5194,
342,
37317,
362,
5470,
73,
285,
891,
651,
7052,
1804,
253,
4477,
897,
253,
5701,
407,
362,
5470,
73,
275,
253,
2457,
19529,
281,
1899,
253,
789,
50276,
9820,
2490,
187,
4118,
18435,
27,
3456,
250,
2858,
22559,
436,
2929,
574,
6804,
10123,
1501,
250,
2858,
22559,
253,
2929,
574,
767,
2266,
14501,
247,
23,
7332,
285,
362,
5470,
73,
665,
9125,
326,
253,
2929,
3400,
9865,
16039,
715,
271,
1774,
1673,
347,
973,
347,
247,
31409,
277,
7675,
23,
665,
20503,
275,
253,
5955,
2708,
326,
597,
403,
275,
3718,
273,
253,
2929,
3738,
858,
417,
5731,
616,
2278,
253,
760,
5780,
14226,
3249,
432,
374,
12847,
87,
253,
913,
1057,
417,
1089,
374,
12847,
10936,
2278,
34593,
247,
23,
72,
1641,
5701,
26799,
253,
913,
84,
8668,
973,
285,
374,
12847,
87,
858,
417,
10078,
275,
5955,
253,
913,
310,
21802,
281,
2997,
253,
2929,
285,
29426,
253,
4477,
281,
897,
616,
4465,
3239,
281,
19837,
616,
6128,
281,
253,
30628
] |
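The long numeric blocks that follow each input/output pair appear to be the input_ids, attention_mask, and labels columns of a tokenized copy of the same text: in the rows shown here the labels block repeats the input_ids, and every attention_mask entry is 1, which is what an unpadded sequence looks like. A minimal sketch of how such columns are typically produced is shown below; the tokenizer actually used to build this dataset is not stated, so "gpt2" and the maximum length are stand-ins only.

```python
from transformers import AutoTokenizer

# "gpt2" is a placeholder: the tokenizer used to build this dataset is not stated here.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def build_features(input_text, output_text, max_length=2048):
    # one sequence = prompt (the review) followed by the target (the summary)
    enc = tokenizer(input_text + output_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all 1s when nothing is padded
        # this dump mirrors input_ids in the labels column; masking the prompt
        # tokens with -100 instead is another common choice
        "labels": list(enc["input_ids"]),
    }
```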
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors consider the problem of offline imitation learning in the presence of suboptimal datasets in the presence of suboptimal data classical baselines like behavior cloning suffer performance hits the drop in performance often correlates positively with increase in number of suboptimal trajectories in this work the authors propose a novel learning objective inspired by the minmax formulation in gans particularly the agent learns a discriminator to distinguish between samples from the expert and the suboptimal demonstrator building on prior work in costsensitive learning this discriminator is used to reweight loss per sample in the offline buffer the proposed algorithm is evaluated on standard offline rl benchmarks across multiple environments in many environments eg hopperv2 style environments the policy improves against strong imitation learning benchmarks the discriminator which takes action probabilities as input is also evaluated in the context of offline policy evaluation on hopperv2 datasets the discriminator output is compared with true reward accumulated by multiple policies strengths the paper is well written and clearly communicates the motivation and contributions sufficient empirical experiments which support the core thesis of the paper the baselines are selected carefully to provide meaningful comparison to the proposed approach using the jointly learned discriminator seems like a novel idea for policy optimization and evaluation weakness the experiments on very similar datasets ex changing the fraction of true positives while i agree that it is strong experiment from figure 1 it is clear that the performance of dwbc is not too sensitive to fraction of positive samples it would be useful to have similar experiments on datasets collected from a random mix of optimal and suboptimal policies instead of buffer of single learning policy for example mixing expert and random policies or adroit environments with human demonstrations offline policy evaluation is evaluated on hopperv2 only it would be useful to have a diversity in environments to evaluate this contribution more objectively overall the paper is well motivated and provides promising results for leveraging suboptimal datasets for effective offline imitation learning the problem is well motivated and the authors provide some novel insights into better objectives for behavior cloning while the authors demonstrate some improvement over the provided baselines i would encourage the authors to consider adding couple of more ablations particularly across datasets with a mix of human and random trajectories docsepthe paper proposes a new offline imitation learning algorithm dwbc for datasets that combine both optimal and suboptimal demonstrations the approach is based on a modified behavioral cloning loss that weighs expert and nonexpert data based on a learned discriminator dwbc is compared against prior methods in openai gym tasks and it is shown to yield better policies compared to the prior work as a byproduct the method learns a discriminator that can be used to estimate the relative performance of any policies without rolling them out in the environment overall the results are quite promising especially the fact that the algorithm can learn to mimic the expert from a really small number of expert demonstrations for example figure 1 shows that it is possible to learn halfcheetah from just 4 thousand expert samples 4 episodes however i do feel like that both the method and the derivation need some more clarity and i 
also have some concerns regarding the experiments the derivation includes several inaccuracies and vague statements the main innovation behind dwbc is to learn a discriminator and condition it on the policy that is being learned the conditioning is motivated by the fact that it makes discrimination easier given the optimal policy one can discriminate the actions based on their probabilities only it is then argued after equation 6 that learning the disciminator becomes more robust if at the same time the policy is optimized by minimizing the amount of useful information it can provide to the discriminator this observation is the most important contribution of the paper yet the justification seems insufficient and vague further it is not obvious that having the derivative of the discriminator loss wrt the policy equal to zero equation 7 will be desirable and that minimization of the final loss in equation 8 will in fact lead to that condition to be true that said the results do indicate the algorithm performs well so im mostly curious seeing a more rigorous derivation and justification that could help shed some light why the dwbc works well moreover there are several other inaccuracies that make following the derivation hard for example the loss ld should not depend on s and a as it does in equation 6 i think in equation 8 the gradients need to be stopped from flowing to the discriminator how is the discriminator trained equation 8 is only minimized with respect to the policy is the same loss used for learning the discriminator i also have several smaller comments on the experiments there seems to be something not quite right with the shading standard deviation in figure 1 for some of the curves the shaded region is really narrow compared to how much the curves change between iterations can you comment on that is the training set different for each seed please add axis labels to figure 2 the datasets used in the experiments have a really particular form as they are collected during online training and thus the expert and the suboptimal data are highly related it would be good to see a comparison where the datasets come from two completely different policies one optimal expert policy and another fixed but suboptimal policy this would be a more realistic setup ie if the data comes from policy that is trained online then why do we need offline learning in the offline policy selection experiment do you use separate datasets for training the discriminator and evaluation if not then perhaps the discriminator is simply memorizing the training data what is the return of the expert policies for the experiments in figure 1 the results presented in the paper are promising but there are several issues with the clarity of the derivation and some with the experiments and thus the paper is not yet of sufficient quality to be published as is docsepthis paper proposes an offline imitation learning framework that incorporates both optimal and suboptimal datasets to learn decisionmaking tasks without requiring any reward annotations to leverage high reward transitions from the suboptimal dataset the authors formulate a discriminator that optimizes a positiveunlabeled learning objective where positive samples come from the optimal dataset and unlabeled samples come from the suboptimal dataset this discriminator is trained in an adversarial fashion along with the policy resulting in a behavior cloning objective where samples from the optimal and suboptimal datasets are weighted differently according to the 
discriminators predictions experiments demonstrate that on a set of simulated locomotion domains the proposed algorithm can leverage the suboptimal dataset to learn more performant policies compared to vanilla behavior cloning objectives and prior offline ilrl baselines the problem studied in this paper is quite relevant and important the ability to reuse large noisy offline datasets to solve new tasks this paper motivates the problem well in the introduction and lays out the preliminaries clearly the related work section also makes references to much of the relevant work though the subsection on offline rl could be enhanced by expanding on the limitations of offline rl something the authors already did in the introduction the method appears to be novel and the discussion framing this method as a weighted behavior cloning objective connects nicely to prior work alongside these strengths there are a number of concerns which i enumerate below 1 the derivation presented in section 32 is a bit confusing and unintuitive first im not sure i agree with the justification for the why policy outputs needs to be part of the discriminator input the authors state that it helps the discriminator distinguish between expert and nonexpert actions better but i would like to see this design choice verified empirically ie comparing to a discriminator that only takes the state and action as input possibly without the adversarial formulation in addition the discussion on the adversarial policy learning objective also can benefit from additional motivation and details in particular why is providing as little information in log pi as possible the best way to learn a discriminator why is the latter necessarily equivalent to maximizing ld for pi this subsection is in my opinion the weakest part of the paper the discussion here can be improved by a combination of providing a more formal framework for the adversarial learning problem complementing the formal framework with intuitive explanations and connecting the ideas here to prior literature 2 i am having trouble understanding the derivation details in appendix b specifically why does fracpartial fpartial log pi equal to fracpartial ldpartial d cdot fracpartial dpartial log pi and also i dont see where the fracpartial dpartial log pi term in incorporated into the final derivation at the end of the section can you please clarify these steps 3 one critically missing baseline is to train the discriminator defined in equation 4 once without all the adversarial learning machinery presented in section 32 and use the discriminator weights to weigh different samples when training the policy this can help justify all of the additional complexities presented in section 32 of updating the discriminator and policy in an alternating optimization scheme 4 while the proposed method appears to work well in locomotion domains it remains unclear how the approach would scale to more complex settings such as the robotic manipulation datasets for the kitchen and adroit tasks in d4rl 1 and the manipulation tasks in robomimic 2 in principle it should not be too difficult to run experiments on these datasets as well and such experiments would certainly enhance the scope of this paper that said given that iclr is primarily focused on core machine learning methods and less so on strong empirical evaluations this is not the primary concern in this review 5 it is unclear to me why bcall and bcnd are constant lines while the other baselines are curves shouldnt all the baselines be shown as 
curves ie where the performance is changing across training iterations 6 while it can be implied from the paper as is pseudocode or a text description of the full training scheme would be nice to have in particular i was wondering how often the discriminator is updated relative to the policy does one update more frequently than the other 1 fu et al d4rl datasets for deep datadriven reinforcement learning 2020 2 mandlekar et al what matters in learning from offline human demonstrations for robot manipulation corl 2021 my reaction to this paper is mixed on one hand the introduction preliminaries and related work are well laid out on the other hand i had several confusions about the method and have some concerns about the experiments see the main review for specific details as it stands i think this paper is marginally below the acceptance threshold i hope that the authors can diligently address the concerns that i raised at which point i will reconsider my recommendation docsepthe paper deals with the following setup offline imitation learning in the presence of both an expert dataset and a nonexpert dataset more precisely the goal is to learn a policy as close as possible to the ones that generated the samples in a dataset de while making the most of samples in a nonexpert dataset do the reward information is not presentused in the dataset the authors draw inspiration from the positiveunlabeled classification as well as the adversarial imitation learning literature to propose a new algorithm to tackle this problem they interleave the training of a discriminator and a policy the discriminator is trained to discriminate between expert and nonexpert dataset using a positiveunlabeled loss and takes as input the state the action and the logit of the policy pia s the policy is trained to imitate the expert on de and to fool the discriminator the authors present results on four environments from the gym mujoco suite with datasets extracted from the d4rl datasets strengths the paper is overall well written and easy to follow the justification of the setup makes sense and is well explained the setting is very interesting and has a great potential impact on the community weaknesses method the authors fail to mention a very close work that already applied positiveunlabeled learning to the imitation learning setup pugail httpsarxivorgabs191100459 this greatly limits the novelty of this work i find the theoretical derivations confusing in particular in appendix b proposition 1 is when an if and only if condition or just an implication i also dont understand why the authors need to introduce the function f overall the mathematical derivations are unclear and would benefit for more details keeping them shorter in the main text but developing them correctly in appendix for example related work i think a few more works would be worth mentioning in particular some previous work already considered having access to two datasets on expert one nonexpert like csi httpshalsupelecarchivesouvertesfrhal00869804document or milo httpsarxivorgpdf210603207pdf evaluation authors write the proposed algorithm can learn behaviors that are much closer to the optimal policies yet they fail at providing experiments that support this claim as they only study the return of the learnt policies i would recommend that they checkout this work httpsarxivorgabs210512034 that provides insights on how to evaluate models in the context of imitation learning what is more the plots dont show the average return of de which makes it 
impossible to use the return as a proxy to study how close the policy is from the demonstrations experiments the experimental setup is good but a bit light as written by the authors their method is quite fast to train so why not provide more results on different setups eg 1 with very little data eg like 1 or 5 trajectories only with very random data in do with human expert data in de with more complicated environments like adroit all these environmentsdatasets are available in d4rl please report the average return in de as a horizontal bar in the plots otherwise it is impossible to calibrate the results as a reader writing in the abstract the authors say both optimal and nonoptimal expert behaviors i find this expression a bit confusing it suggests that the algorithm will only work if the do is actually made of expert but slightly suboptimal trajectories nit in 32 much hard much harder which we denote it as which we denote as i believe this is an interesting idea in an interesting setup yet it lacks novelty and it would deserve more work notably on the experimental part before being published
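The reviews above all describe the same recipe: a discriminator is trained to tell expert transitions from the union of expert and unlabeled transitions with a positive-unlabeled (pu) objective, and its output is then used to reweight a behavior cloning loss. A minimal sketch of that general recipe is given below; it is not the paper's exact objective (the eqs. 6-8 the reviewers discuss are not reproduced here), the class prior pi_p, the weight eta, and the odds-ratio weighting are illustrative choices, and the discriminator network itself (which, per the reviews, also takes log pi(a|s) as an input) is assumed to be defined elsewhere.

```python
import torch
import torch.nn.functional as F

def pu_discriminator_loss(d_expert, d_union, pi_p=0.5):
    # non-negative pu risk: expert data is positive, union data is unlabeled;
    # d_* are sigmoid outputs of the discriminator, in (0, 1)
    pos   = F.binary_cross_entropy(d_expert, torch.ones_like(d_expert))
    neg_u = F.binary_cross_entropy(d_union, torch.zeros_like(d_union))
    neg_p = F.binary_cross_entropy(d_expert, torch.zeros_like(d_expert))
    return pi_p * pos + torch.clamp(neg_u - pi_p * neg_p, min=0.0)

def weighted_bc_loss(log_pi_expert, log_pi_union, d_union, eta=0.5):
    # clone expert actions everywhere; clone unlabeled actions only in proportion
    # to how expert-like the discriminator believes they are
    w = (d_union / (1.0 - d_union + 1e-6)).detach()  # stop gradients into the discriminator
    return -(log_pi_expert.mean() + eta * (w * log_pi_union).mean())
```

In this sketch the two losses are optimized in alternation and the policy never receives gradients through the discriminator, which is one concrete way to realize the training scheme the reviewers ask about; the paper's actual update order and gradient flow may differ.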
### Summary: | the authors introduce a method for offline imitation learning in the presence of optimal and nonoptimal data in particular they propose to learn a discriminator that can be then further used to modify the behavior cloning loss which leads to performance improvements over baselines the reviews mention that the idea is novel and most sections of the paper are well written and selfexplanatory they do point out however several flaws such as the clarity of the derivation and the thoroughness of experimental evaluation while the paper has significantly improved during the rebuttal its significant changes warrant another round of reviews i encourage the authors to continue improving the paper addressing the reviewers feedback and resubmitting it as it has a potential to be a strong submission | [
6331,
432,
19246,
281,
253,
7134,
12915,
50276,
5430,
310,
253,
7134,
12915,
10166,
5150,
854,
310,
760,
36625,
342,
1675,
281,
253,
3646,
310,
253,
1072,
2957,
908,
323,
4715,
253,
7134,
12915,
50276,
74,
671,
452,
2067,
4577,
5701,
327,
253,
4679,
50276,
9088,
3133,
281,
320,
1633,
417,
3240,
987,
342,
253,
439,
6748,
2629,
11254,
275,
4677,
337,
323,
690,
273,
253,
9191,
253,
37042,
2919,
310,
1663,
6891,
2429,
281,
849,
1199,
253,
9191,
1818,
875,
25142,
476,
368,
4385,
327,
326,
310,
253,
3733,
873,
50276,
19623,
323,
1016,
8357,
50275,
32897,
823,
7844,
13301,
281,
4677,
374,
50276,
783,
15302,
908,
275,
253,
4679,
452,
247,
1663,
1798,
830,
347,
597,
403,
5728,
1309,
3909,
3733,
285,
3021,
253,
6485,
285,
253,
749,
29776,
941,
403,
4122,
2905,
352,
651,
320,
1175,
281,
923,
247,
5301,
835,
253,
15302,
1705,
432,
767,
4336,
1027,
7823,
581,
8654,
6485,
3646,
285,
1529,
4229,
533,
749,
29776,
3646,
436,
651,
320,
247,
625,
15958,
9978,
26332,
604,
253,
941,
3249,
432,
3646,
326,
310,
10166,
3909,
840,
2139,
513,
359,
878,
28841,
4715,
50276,
249,
253,
28841,
3646,
5438,
3368,
513,
368,
897,
4858,
15302,
323,
3733,
253,
7134,
12915,
285,
7103,
604,
417,
840,
4931,
253,
7134,
12915,
310,
3365,
16407,
3006,
253,
3733,
941,
50275,
5371,
310,
253,
1091,
273,
253,
6485,
7823,
323,
253,
4679,
275,
4677,
337,
50276,
783,
1543,
3559,
275,
253,
2929,
403,
12532,
533,
627,
403,
2067,
3374,
342,
253,
19843,
273,
253,
28529,
285,
690,
342,
253,
4679,
285,
3021,
253,
2929,
310,
417,
2568,
273,
4209,
3290,
281,
320,
3863,
347,
310,
5474,
33032,
2520,
2929,
29328,
271,
28841,
45738,
4715,
7792,
326,
31167,
1097,
8654,
285,
749,
29776,
15302,
281,
3037,
3061,
11849,
8892,
1293,
10568,
667,
10921,
31825,
281,
25057,
1029,
10921,
16307,
432,
253,
749,
29776,
10895,
253,
4477,
36803,
247,
7134,
12915,
326,
5556,
4219,
247,
2762,
328,
22027,
4715,
8103,
835,
2762,
3530,
1705,
432,
253,
8654,
10895,
285,
440,
22027,
3530,
1705,
432,
253,
749,
29776,
10895,
436,
7134,
12915,
310,
10166,
275,
271,
48960,
8142,
2112,
342,
253,
3646,
4795,
275,
247,
3879,
34591,
8103,
835,
3530,
432,
253,
8654,
285,
749,
29776,
15302,
403,
17375,
13359,
2556,
281,
253,
20741,
2392,
13650,
4679,
7568,
326,
327,
247,
873,
273,
15524,
23904,
5011,
10625,
253,
4081,
5933,
476,
25057,
253,
749,
29776,
10895,
281,
3037,
625,
1347,
386,
7823,
2429,
281,
26724,
3879,
34591,
16566,
285,
2720,
28841,
4164,
8435,
1666,
25379,
253,
1895,
5421,
275,
436,
2929,
310,
3240,
4623,
285,
1774,
50276,
783,
3745,
281,
33150,
1781,
27620,
28841,
15302,
281,
8415,
747,
8892,
436,
2929,
15265,
684,
253,
1895,
973,
275,
253,
10199,
285,
41714,
562,
253,
11944,
249,
3927,
4518,
253,
2905,
789,
2593,
671,
2789,
10414,
281,
1199,
273,
253,
4623,
789,
2167,
253,
19087,
327,
28841,
391,
77,
812,
320,
8655,
407,
16122,
327,
253,
7364,
273,
28841,
391,
77,
1633,
253,
4477,
2168,
858,
275,
253,
10199,
253,
1332,
4620,
281,
320,
4460,
285,
253,
5955,
39926,
436,
1332,
347,
247,
17375,
3879,
34591,
8103,
23417,
23395,
281,
2720,
789,
12936,
841,
20544,
627,
403,
247,
1180,
273,
7350,
534,
891,
49860,
2708,
337,
253,
28529,
3559,
275,
2593,
4567,
310,
247,
2372,
21643,
285,
25962,
48714,
806,
516,
417,
2119,
891,
5194,
342,
253,
22861,
323,
253,
2139,
3646,
18012,
3198,
281,
320,
629,
273,
253,
7134,
12915,
3280,
253,
4477,
1375,
326,
352,
7729,
253,
7134,
12915,
12129,
875,
6485,
285,
44382,
8292,
5231,
1805,
533,
891,
651,
751,
281,
923,
436,
2216,
4327,
16058,
45190,
50276,
466,
10941,
281,
247,
7134,
12915,
326,
760,
3936,
253,
1375,
285,
2250,
347,
3280,
6830,
1293,
253,
48960,
15895,
275,
1635,
253,
5955,
327,
253,
48960,
3646,
4715,
8103,
671,
476,
5649,
432,
3081,
16038,
285,
4278,
275,
1798,
2139,
310,
5277,
347,
1652,
1491,
275,
2412,
12580,
347,
1896,
253,
1682,
1039,
281,
3037,
247,
7134,
12915,
2139,
310,
253,
6158,
7933,
6425,
281,
46875,
42651,
323,
12580,
436,
19087,
310,
275,
619,
4743,
253,
5075,
383,
629,
273,
253,
2929,
253,
5955,
1060,
476,
320,
5520,
407,
247,
5019,
273,
5277,
247,
625,
7473,
7792,
323,
253,
48960,
4715,
1895,
13503,
272,
253,
7473,
7792,
342,
27350,
22909,
285,
12873,
253,
5697,
1060,
281,
2720,
6239,
374,
891,
717,
1907,
7596,
4685,
253,
28529,
4278,
275,
30762,
270,
5742,
2139,
1057,
1315,
317,
3214,
269,
3214,
2412,
12580,
4503,
281,
1315,
317,
3214,
42651,
3214,
277,
260,
5256,
1315,
317,
3214,
277,
3214,
2412,
12580,
285,
671,
891,
13414,
923,
835,
253,
1315,
317,
3214,
277,
3214,
2412,
12580,
1307,
275,
11217,
715,
253,
2457,
28529,
387,
253,
990,
273,
253,
2593,
476,
368,
4496,
19148,
841,
5018,
495,
581,
21038,
5816,
8245,
310,
281,
6194,
253,
7134,
12915,
2931,
275,
5150,
577,
2378,
1293,
512,
253,
48960,
4715,
20949,
3559,
275,
2593,
4567,
50276,
395,
897,
253,
7134,
12915,
13461,
281,
14357,
1027,
3530,
672,
3733,
253,
3646,
436,
476,
1361,
15249,
512,
273,
253,
3081,
48663,
3559,
275,
2593,
4567,
273,
22753,
253,
7134,
12915,
285,
3646,
275,
271,
28035,
13757,
6974,
577,
1223,
253,
4081,
1332,
4620,
281,
789,
973,
275,
23904,
5011,
10625,
352,
4558,
12744,
849,
253,
2746,
651,
4311,
281,
625,
2570,
7533,
824,
347,
253,
35121,
19763,
15302,
323,
253,
8576,
285,
519,
14790,
8892,
275,
277,
21,
8435,
337,
285,
253,
19763,
8892,
275,
4848,
297,
303,
280,
374,
275,
8063,
352,
943,
417,
320,
1512,
2834,
281,
1408,
4679,
327,
841,
15302,
347,
973,
285,
824,
4679,
651,
5604,
7278,
253,
7990,
273,
436,
2929,
326,
753,
1677,
326,
17857,
32888,
310,
8558,
7106,
327,
5161,
5145,
4715,
3082,
285,
1679,
594,
327,
2266,
16774,
27163,
436,
310,
417,
253,
3625,
4468,
275,
436,
2278,
50276,
22,
352,
310,
12744,
281,
479,
2139,
270,
4065,
285,
49501,
2109,
403,
3638,
3104,
1223,
253,
643,
1666,
25379,
403,
9191,
943,
2649,
512,
253,
1666,
25379,
320,
2011,
347,
9191,
50276,
466,
835,
253,
3045,
310,
6890,
2439,
3733,
25142,
50276,
23,
1223,
352,
476,
320,
10466,
432,
253,
2929,
347,
310,
10585,
406,
853,
390,
247,
2505,
5740,
273,
253,
2120,
3733,
6974,
651,
320,
5322,
281,
452,
275,
1798,
891,
369,
12371,
849,
2223,
253,
7134,
12915,
310,
9300,
4103,
281,
253,
3646,
50276,
18566,
581,
5731,
625,
7208,
685,
253,
643,
50275,
18,
15260,
1162,
355,
277,
21,
8435,
15302,
323,
3676,
2856,
324,
1069,
257,
35221,
4715,
9169,
50276,
19,
7649,
282,
18970,
1162,
355,
752,
8213,
275,
4715,
432,
28841,
1966,
32367,
323,
15688,
19763,
944,
77,
43425,
619,
4884,
281,
436,
2929,
310,
6804,
327,
581,
1133,
253,
10199,
11944,
249,
3927,
285,
2905,
789,
403,
973,
10090,
562,
327,
253,
643,
1133,
891,
574,
2067,
1461,
16723,
670,
253,
1332,
285,
452,
690,
7350,
670,
253,
4679,
923,
253,
2022,
2278,
323,
2173,
4278,
347,
352,
9572,
891,
1158,
436,
2929,
310,
42876,
2708,
253,
14924,
7887,
891,
3524,
326,
253,
4477,
476,
23947,
1574,
2953,
253,
7350,
326,
891,
5439,
387,
534,
1127,
891,
588,
24033,
619,
17401,
5474,
339,
431,
248,
2929,
13330,
342,
253,
1563,
9978,
28841,
45738,
4715,
275,
253,
3361,
273,
1097,
271,
6485,
10895,
285,
247,
44382,
8292,
10895,
625,
10534,
253,
4736,
310,
281,
3037,
247,
3646,
347,
2810,
347,
1896,
281,
253,
4394,
326,
4561,
253,
3530,
275,
247,
10895,
372,
1223,
2403,
253,
954,
273,
3530,
275,
247,
44382,
8292,
10895,
513,
253,
10921,
1491,
310,
417,
1246,
3197,
275,
253,
10895,
253,
4477,
3812,
17006,
432,
253,
2762,
328,
22027,
9162,
347,
973,
347,
253,
48960,
45738,
4715,
6239,
281,
12661,
247,
747,
5933,
281,
18915,
436,
1895,
597,
25817,
1123,
253,
3733,
273,
247,
7134,
12915,
285,
247,
3646,
253,
7134,
12915,
310,
10166,
281,
30530,
875,
6485,
285,
44382,
8292,
10895,
970,
247,
2762,
328,
22027,
2957,
285,
3936,
347,
3280,
253,
1375,
253,
2250,
285,
253,
2412,
262,
273,
253,
3646,
268,
571,
50276,
84,
253,
3646,
310,
10166,
281,
516,
17255,
253,
6485,
327,
372,
285,
281,
11213,
253,
7134,
12915,
50275,
783,
4477,
1246,
1543,
327,
1740,
12620,
432,
253,
17409,
278,
10441,
16856,
18880,
342,
15302,
10375,
432,
253,
277,
21,
8435,
15302,
50276,
296,
3755,
20556,
50276,
783,
2929,
310,
4583,
973,
3542,
285,
3477,
281,
956,
253,
22861,
273,
253,
9978,
2789,
3282,
285,
310,
973,
5544,
50276,
783,
4758,
310,
1077,
4722,
285,
556,
247,
1270,
2442,
3486,
327,
253,
3114,
50276,
20881,
1255,
265,
50276,
9349,
50272,
783,
4477,
1891,
281,
3748,
247,
1077,
2810,
789,
326,
2168,
3732,
2762,
328,
22027,
4715,
281,
253,
45738,
4715,
9978,
268,
814,
647,
5987,
39962,
2061,
5375,
746,
37965,
28333,
436,
10260,
7787,
253,
38135,
273,
436,
789,
50272,
74,
1089,
253,
10527,
3538,
569,
21643,
275,
1798,
275,
30762,
270,
13989,
337,
310,
672,
271,
604,
285,
760,
604,
1617,
390,
816,
271,
27570,
50272,
74,
671,
13414,
2096,
2139,
253,
4477,
878,
281,
9569,
253,
1159,
269,
4583,
253,
15965,
3538,
569,
403,
12744,
285,
651,
5649,
323,
625,
4278,
7562,
731,
12217,
275,
253,
2022,
2505,
533,
6684,
731,
9113,
275,
30762,
323,
1650,
50275,
4919,
789,
50272,
74,
1158,
247,
1643,
625,
2987,
651,
320,
4409,
29570,
275,
1798,
690,
2045,
789,
2168,
2783,
1907,
2289,
281,
767,
15302,
327,
6485,
581,
44382,
8292,
751,
260,
9245,
50276,
3614,
73,
932,
48772,
282,
68,
1116,
1644,
276,
1748,
265,
925,
5590,
8897,
2090,
28927,
3306,
50276,
263,
2301,
80,
5987,
39962,
2061,
9275,
16899,
29251,
18202,
9275,
50276,
15419,
2368,
50272,
43355,
3630,
253,
4081,
5933,
476,
3037,
13576,
326,
403,
1199,
8003,
281,
253,
8654,
7823,
2568,
597,
1891,
387,
5277,
4679,
326,
1329,
436,
1750,
347,
597,
760,
1263,
253,
1091,
273,
253,
34003,
7823,
891,
651,
5583,
326,
597,
36620,
436,
789,
5987,
39962,
2061,
5375,
19,
10655,
8193,
1706,
50276,
3529,
3400,
16039,
327,
849,
281,
7472,
3210,
275,
253,
3634,
273,
45738,
4715,
752,
310,
625,
253,
14777,
13414,
921,
253,
3388,
1091,
273,
372,
534,
2789,
352,
7479,
281,
897,
253,
1091,
347,
247,
17335,
281,
1263,
849,
2810,
253,
3646,
310,
432,
253,
32367,
50276,
16217,
3825,
50272,
783,
5661,
9978,
310,
1175,
533,
247,
2372,
1708,
347,
3542,
407,
253,
4477,
616,
1332,
310,
3240,
3809,
281,
6194,
594,
2139,
417,
2085,
625,
1543,
327,
1027,
873,
8777,
24088,
337,
342,
1077,
1652,
941,
24088,
751,
337,
390,
608,
24102,
760,
342,
1077,
3632,
941,
275,
513,
342,
1966,
6485,
941,
275,
372,
342,
625,
9542,
12620,
751,
519,
14790,
512,
841,
12620,
46906,
1507,
403,
2130,
275,
277,
21,
8435,
50272,
32897,
1304,
253,
3388,
1091,
275,
372,
347,
247,
11593,
2534,
275,
253,
14777,
5010,
352,
310,
7479,
281,
24403,
366,
253,
1543,
347,
247,
9414,
50276,
17695,
50272,
249,
253,
12002,
253,
4477,
1333,
1097,
8654,
285,
1327,
29776,
6485,
13576,
891,
1089,
436,
2048,
247,
2372,
21643,
352,
5936,
326,
253,
5933,
588,
760,
789,
604,
253,
513,
310,
2686,
1160,
273,
6485,
533,
5777,
749,
29776,
24102,
50272,
32202,
275,
4567,
1199,
1892,
50276,
25914,
12150,
534,
359,
9173,
352,
347,
50276,
4609,
359,
9173,
347,
50276,
74,
2868,
436,
310,
271,
4722,
2934,
275,
271,
4722,
9978,
2568,
352,
19756,
38135,
285,
352,
651,
17337,
625,
789,
19836,
327,
253,
5661,
629,
1078,
1146,
3863,
2490,
187,
4118,
18435,
27,
783,
4477,
9569,
247,
1332,
323,
28841,
45738,
4715,
275,
253,
3361,
273,
8654,
285,
1327,
29776,
941,
275,
1798,
597,
12661,
281,
3037,
247,
7134,
12915,
326,
476,
320,
840,
2007,
908,
281,
10007,
253,
3879,
34591,
2957,
534,
5644,
281,
3045,
11701,
689,
1666,
25379,
253,
10123,
3748,
326,
253,
2934,
310,
4460,
285,
954,
7118,
273,
253,
2929,
403,
973,
3542,
285,
11329,
453,
89,
11139,
2473,
597,
513,
1127,
562,
2299,
2067,
32138,
824,
347,
253,
19843,
273,
253,
28529,
285,
50276,
783,
11080,
1255,
273,
5661,
7103,
1223,
253,
2929,
556,
3012,
5520,
1309,
253,
30080,
22559,
697,
1534,
2544,
7501,
1529,
3790,
273,
10123,
891,
11907,
253,
4477,
281,
4035,
11138,
253,
2929,
15974,
253,
30628,
8680,
285,
501,
538,
15318,
352,
347,
352,
556,
247,
2442,
281,
320,
247,
2266,
19529
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper presents an application of sharpnessaware minimization sam to training of visual transformers vit and mlpmixer the idea of the paper is that models with weaker inductive priors transformers and mlpmixer suffer from sharp local minima more than cnnmodels and sam fixes this issue also significantly improving test accuracy on various version of imagenet the paper is experimental nontheoretical kind mostly studyng the effect of sam on resnet vit and mlpmixer training and its interplay with dataset size and augmentation also paper studies how sam influences different aspects of the trained network namely attention masks eigenvalues of the hessian of models blcoks and loss landscape the main stated result is that sam can more or less replace the heavy image augmentations for vit and mlpmixer result1 and improve accuracy on clean and corrupted test set strong points impact of the result1 i think that despite of the paper simplicity lets apply sam to the noncnn architectures the message that one can replace domainspecific augmentations with generalpurpose optimizer is of great importance all sota methods for image classification rely on the augmentations to provide good results and it is not easy to come up with new augmentations the most papers which present learned augmentations in fact just learn some parameters and combinations of known augmentation however for other domains like nlp or some kind of medical data it is not clear how one could augment the data if one can instead just take generalpurpose architecture transformer and train it with generalpurpose optimizer sam that is a big deal the experiments are done rigorously and with care the only possibly missing experiment would be to train a model without an inductive bias at all like vanilla mlp and check if sam would greatly improve its results or not the paper also states the limitations of the approach the main of which is computational cost weak points in the current form i was asked to review a paper after the rebuttal stage i do not see any major weaknesses the paper tackles a single problem and studies it extensively the problem in my opinion is significant and the paper would likely have an impact therefore i vote for clear acceptance docsepthe paper is clearly written with extensive experiments the paper compares dataset size data augmentation and different network architectures vits resnets and mlpmixers by using loss landscape visualization number of parameters neural tangent kernel ntk hessian matrix training dynamics percentage of activated neurons throughput norm of weights and activations attention map visualizations and missing rate under linear interpolation the main contributions of the paper are 1 vits with existing sharpnessaware minimizer sam outperform resnets of similar size and throughput when trained from scratch on imagenet without largescale pretraining or strong data augmentations 2 the motivation and experiments including the ablation studies are wellorganized and adequate the paper reveals that the vits and mlpmixers have extremely sharp local minima of converged models through loss landscape visualization and hessian dominate value the impact of the paper may be limited the paper only compares with resnet152 with no augmentation and large dataset pretraining currently the augmentation and large dataset pretraining are vastly employed in the entire community the impact can be enlarged by conducting experiments on vitl and comparing resnet152x4 pretrained on bitl the paper is pretty wellwritten with 
extensive results and figures the impact of the paper can be largely improved if the largescale experiments can be conducted and a better accuracy than the current stateoftheart methods can be achieved docsepthe authors analyze the effects of sharpnessaware minimization sam when applied to vision transformers vits and mlpmixers they find that the converged loss surface of vits and mlpmixers is sharp compared to resnets and that sam ameliorates this issue yielding improved validation accuracy on imagenet they then show that sam improves the performance of vits and mlpmixers in a variety of image classification scenarios including adversarial attacks naturalistic corruptions contrastive training and transfer learning they also examine the effect of sam on activation sparsity activation maps and the relationships between different model architecture components and loss surface sharpness update i have increased my score from a 3 to an 8 after the the authors response sam is a new and promising method so its important to examine and understand its effects and the benefits of sam for vits and mlpmixers appear significant i applaud the breadth of the authors attempts they examine the effects of sam across a wide range of image classification scenarios and using a variety of analysis methods i am convinced that sam works well for vits and mlpmixers but im not convinced of why it works my main concern about this paper is that it is insufficiently rigorous the authors provide a number of explanations that are not justified by the evidence and none of the experiments are run with replicates making it difficult to determine whether observed differences are meaningful i am not confident that this work is suitable for publication in its current state however i am optimistic that it could be suitable for publication after revision detailed feedback replicates and measures of variability ideally all the experiments should be run in replicate with 10 different seeds however i understand that this is computationally costly so i would accept running replicates three at a bare minimum for the six core model configurations vitb16 mlpmixerb16 and resnet152 with and without sam results with these models should be accompanied by measures of variability or simply listing all three numbers if only 3 replicates are used unjustified claims there are numerous unjustified scientific claims made which i detail as follows end of first paragraph in section 3 there also exists a large gap between vits and resnets in robustness tests requires a reference in section 3 small training errors the authors claim that using the crosstoken mlp to learn the interplay across image patches is more prone to overfitting than vits selfattention mechanism whose behavior is restricted by a softmax this claim seems speculative and should be tested the authors could train a vit with a different attention normalization scheme or without one altogether learning q k and v projections as opposed to learning mlp at the very least a convincing theoretical explanation should be provided in section 42 higher accuracy the authors state empirically the degree of improvement negatively correlates with the level of inductive biases built into the architecture resnets with inherent translation equivalence and locality benefit less from landscape smoothing than the attentionbased vits this is an interesting claim but there are multiple issues with it first level of inductive bias is vague i would suggest the language be changed to correlates with 
expressivity or negatively correlates with the constraints of the inductive biases the level or severity of an inductive bias can be orthogonal to the constraints it imposes on expressivity convit dascoli et al 2021 for example has a convolutional inductive bias but is completely unconstrained in that it can learn to be a vanilla vit second this claim can easily be tested the authors could examine the effects of sam when varying model architecture choices for example exchanging layer norm for batch norm in resnets would have little effect on the models expressivity and consequently the authors hypothesis predicts that the effect of sam should not vary significantly the authors could examine the effect of sam on other architectures with varying and controllable inductive biases such as convit dascoli et al 2021 cvt wu et al 2021 cmt guo et al 2021 a vit with a convolutional stem xiao et al early convolutions help transformers see better 2021 or a cnnreparameterized into an mlp la dascoli et al finding the needle in the haystack with convolutions 2020 the effects of sam on activation norm table 3 in the vit are not consistent so i think the authors should avoid making the claim that we find that the norm of the postactivation valuebecome even bigger section 44 greater weight norms the authors state interestingly the vit model optimized with sam can encode plausible segmentation information giving rise to better interpretability than the one trained via the conventional sgd optimization claims regarding segmentation should be evaluated with a segmentation task eg results using a transformer backbone for segmentation with vs without sam and claims about interpretability should be evaluated with controlled human experiments see leavitt and morcos towards falsifiable interpretability i think the strongest claim that could be made without appropriate experiments is it appears more interpretable table 11 shows that the vit model is trained using adam not sgd but you write giving rise to better interpretability than the one trained via the conventional sgd optimization transformers are notoriously unstable during training adam is thought to compensate for this instability it would be interesting to see whether sam allows vision transformers to be trained with sgd does figure 2a depict training with or without sam i think it would be informative to plot both to depict the effect of sam over the course of training the following claims are made in the introduction some of which i address in my above comments we conjecture that the convolutioninduced translation equivariance and locality help resnets escape from bad local minima when trained on visual data this claim is not justified the effects of translation equivariance and locality on sam are never tested directly nor is the concept of a bad local minimum clearly defined as suggested before the authors should repeat their analyses using vision transformers with convolutionallocal inductive biases the authors should also clearly define what they mean by a bad local minimum by analyzing some intrinsic model properties we find that the models after sam reduce the hessian eigenvalues by activating sparser neurons on imagenet especially in the first few layers this language implies a causal chain that sam increases sparsity and the increased sparsity reduces hessian eigenvalues the current experiments can only justify the claim that sam increases sparsity and reduces hessian eigenvalues the authors need to conduct experiments showing that sam causes 
sparsity and sparsity causes a reduction in hessian eigenvalues this could be done by regularizing to increasedecrease sparsity with vs without sam and examining the effects on hessian eigenvalues and accuracy otherwise the authors should amend this claim to remove the implied samsparsityhessian causality the weight norms increase implying the commonly used weight decay may not be an effective regularization alone changes in weight norms alone are insufficient to justify this claim the authors should repeat their analyses with varying amounts of weight decay with vs without sam a side observation is that unlike resnets and mlpmixers vits have extremely sparse active neurons revealing the redundancy of input image patches and the capacity for network pruning its not clear to me how sparsity translates to image patch redundancy this is a very interesting idea and should be better explained another interesting finding is that vits performance gain also translates to plausible attention maps containing more perspicuous information about semantic segmentation i address this in an earlier comment evaluate this claim with a segmentation task andor human interpretability experiments other feedback analyzing a vision transformer with an inductive bias for locality such as convit dascoli et al 2021 cvt wu et al 2021 andor cmt guo et al 2021 could strengthen a lot of the claims in the paper and bridge the results from vits to resnets the authors state that these results show that both sam and augmentations make the loss landscape flat on average the difference is that sam enforces the smoothness by reducing the largest curvature via a minimax formulation to optimize the worstcase scenario while augmentations ignore the worsecase curvature and instead smooth the landscape over the directions concerning the inductive biases induced by the augmentations this distinction between average and worstcase curvature is an interesting result i think the authors should extend their analysis and assess the average flatness of other models with vs wo sam in section 42 better robustness the effects of sam on robustness in resnets should be mentioned furthermore percent changes in evaluation metrics such as accuracy must be specified as relative or absolute ie percentage point finally the authors need to control for the effect of the clean accuracy improvement is the change in imagenetc accuracy larger than expected given the baseline change in imagenet accuracy all the plots in figure 2 should also contain a resnet152 with and without sam i understand that there are space constraints but the vit sparsity results should be depicted in a figure perhaps consider combining figs 2cd to make more space the presentation of the contrastive learning results section 52 should include results for the baseline resnet152 figure 4a should show values both with and without sam its not clear to me how the sparsity of mixer activations suggests the potential redundancy of image patches the authors should include a discussion of limitations of their work mlpmixers and vits are more difficult to train than cnns this study shows that sam is very effective for training mlpmixers and vits and attempts to understand why these could all be important contributions to computer vision i am convinced that sam works well for vits and mlpmixers but this paper is insufficiently rigorous leaving me unconvinced of why it works i dont think its suitable for publication in its current state but it certainly could be after revisions docsepas the 
success of vision transformer vit has shown its potentials for computer vision tasks this paper investigates a more effective way of training a vit understand a standard imagenet pretraining setting such as no extra training data and no strong data augmentation in general without those conditions a typical vit can not perform as good as widely convolutional based network architectures such as resnet this paper addresses the issue from the point of loss landscape and then propose to use a better optimization strategy named sharpaware minimizer sam for vit related architecture optimization with the proposed sam vit can achieve better accuracy significantly understand standard imagenet trainingtesting protocol in addition to the improved performance on imagenet this paper further shows the visualization of the attention map and the improved performance on other applications such as contrastive learning and adversarial training this paper is well written and organized the motivation comes from the loss landscapes comparison for different network architecture such as resnet vit and mixer the visualization has revealed the problem of the current vit and mixer as both of them tend to converge at sharp minima which will limit the generalization capabilities as the issue has been clearly pointed out the solution is straightforward avoid the sharp local minima when training vit such that this paper adopts sam as an answer although sam is from existing work this paper gives a good explanation on why among different optimizer sam could be a good choice in addition to the method this paper has shown that on image classification contrastive learning and adversarial training vit can be consistently improved with sam especially the visualization of attention map w sam the vit feature map does contain more semantics concerns i agree with that current vit tends to converge at sharp local minima it could be the problem from the usage of existing optimizer on the other hand the problem may also comes from the nature of current vit design recent vision transformers such as swin pvtv2 volo all of them has clearly shown that by optimizing the architecture of vit it can also avoid the local minima based on current manuscript it is unknown that how sam can be used for swin or pvtv2 and whether sam can be used as a general optimizer for arbitrary vision transformer training 1 swin transformer hierarchical vision transformer using shifted windows 2 pvtv2 improved baselines with pyramid vision transformer 3 volo vision outlooker for visual recognition in overall i vote for acceptance for this paper but i think it can be further improved in the experiments resnetsam results are promising the claim could be stronger if other vision transformers can be evaluated as vit has been widely argued for its inefficient architecture design docsepthis paper alleviate the dependency on massive pretraining and data augmentation it promotes the smoothness using a recently proposed sharpnessaware optimizer to improve the accuracy and robustness of the vits and mlpmixers this paper has demonstrated that the sharpnessaware optimizer could be leveraged to boost the performance of vits and mlpmixers without pretraining and strong augmentaitons on different tasks including supervised adversarial contrastive and transfer learning the paper has conducted an abundance of experiments on different tasks to demonstrate the effectiveness of the sharpnessaware minimization sam the authors present a comprehensive analysis on sam the differences 
and similarities between sam and data augmentation are well explained this paper uses the existing idea of sharpnessaware minimization to alleviate the dependency of massive pretraining and strong augmentation which makes the novelty limited overall this paper is well writen the overview of sam and the explanation of some observations are clearly presented besides the authors demonstrate the effectivenss of sam in various tasks by showing a large amount of experimental results the experimental part is comprehensive and convincing the major drawback of this paper is that the sam is an existing idea even though the authors use it to overcome the issue brought by heavy pretraining and data augmentations therefore this paper is like proving the effectiveness of an existing idea by doing many experimental validations
### Summary:
description the paper demonstrates that efficient architectures such as transformers and mlpmixers which do not utilize translational equivariance in the design when regularized with sam sharpness aware minimization can achieve same or better performance as convolutional networks in the vision problems where the convolutional networks were traditionally superior with data augmentation or not regularized or not the paper demonstrates it very thoroughly through many experiments and analysis of the loss surfaces

decision discussion i find the paper to be very timely in its context it has a remarkable coverage of experimental studies and different use cases sam augmentation contrastive adversarial transfer learning as well as ablation studies such as keeping first layers convolutional the reviewers have asked further questions and the authors were able to conduct respective experiments within the discussion period fully addressing all concerns and making the findings of the paper even more comprehensive and convincing after the rebuttal 3 reviewers were for accept one marginally above and one marginally below in the latter case the concern was that the paper is an experimental study of a known method sam while i understand that many researchers are expecting theoretical and innovative results from iclr papers i find that it does not prevent acceptance indeed the experimental findings in this paper are on a hot topic could be of wide interest and could lead to a change of paradigm in designing models towards more generic ones on the other hand it could just indicate that cnns are not fully exploiting their potential eg not exploiting the context well enough in the hidden layers to get more insight i am still wondering how the predictions behave if the input is shifted by a few pixels in cnn and transformers it seems counterintuitive that making the first layers in vit just an mlp of image patches is a good design furthermore fully convolutional models allow to take input of an arbitrary size and average the predictions on the output if it happened to be larger than 1x1 since convolutions are also used for eg semantic segmentation and generative models one should not and the authors do not in the paper discard them too fast see also a recent work combining transformers and convolutional networks chen et al iccv 2021 visformer the visionfriendly transformer
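the reviews and meta-review above repeatedly refer to sam's two-step minimax update (first perturb the weights toward the locally worst-case point inside a small ball, then take the descent step using the gradient computed at that perturbed point). below is a minimal pytorch-style sketch of that update for reference only; it is not code from the reviewed paper, and model, loss_fn, base_optimizer and the radius rho are illustrative placeholders.

```python
import torch


def sam_step(model, loss_fn, inputs, targets, base_optimizer, rho=0.05):
    """One sharpness-aware minimization (SAM) update: ascend to the approximate
    worst-case weights within an L2 ball of radius rho, then descend using the
    gradient evaluated at that perturbed point."""
    # first forward/backward pass: gradient at the current weights
    base_optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()

    # epsilon = rho * g / ||g||, applied in place to the weights
    with torch.no_grad():
        grads = [p.grad for p in model.parameters() if p.grad is not None]
        grad_norm = torch.norm(torch.stack([g.norm(p=2) for g in grads]), p=2)
        perturbations = []
        for p in model.parameters():
            if p.grad is None:
                continue
            e = rho * p.grad / (grad_norm + 1e-12)
            p.add_(e)  # w <- w + epsilon: move to the (approximate) worst case
            perturbations.append((p, e))

    # second forward/backward pass: the sharpness-aware gradient
    base_optimizer.zero_grad()
    loss_fn(model(inputs), targets).backward()

    # restore the original weights, then step with the SAM gradient
    with torch.no_grad():
        for p, e in perturbations:
            p.sub_(e)
    base_optimizer.step()
    return loss.detach()
```

the doubled forward/backward cost per step in this sketch is the main limitation of the approach that the first review points to.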
10234,
32270,
14417,
285,
33643,
1361,
501,
47301,
8773,
432,
3076,
1980,
46836,
672,
10166,
327,
5304,
941,
50276,
2520,
1750,
310,
417,
17285,
253,
2538,
273,
10234,
32270,
14417,
285,
33643,
327,
1775,
403,
1620,
5762,
3587,
4543,
310,
253,
4473,
273,
247,
3076,
1980,
5927,
4518,
2931,
347,
5125,
1078,
253,
4477,
943,
10280,
616,
6260,
970,
8113,
4979,
398,
342,
27311,
455,
3100,
42115,
31306,
253,
4477,
943,
671,
4518,
4853,
752,
597,
1599,
407,
247,
3076,
1980,
5927,
50276,
1615,
18918,
690,
15276,
1566,
3607,
359,
1089,
326,
253,
3210,
846,
1775,
4796,
253,
344,
859,
757,
20223,
407,
27149,
653,
9332,
8512,
327,
4440,
257,
292,
3340,
275,
253,
806,
1643,
8090,
50276,
2520,
3448,
8018,
247,
19349,
5931,
326,
1775,
5459,
37139,
414,
285,
253,
2559,
37139,
414,
11355,
344,
859,
757,
20223,
253,
1655,
4679,
476,
760,
15249,
253,
1750,
326,
1775,
5459,
37139,
414,
285,
11355,
344,
859,
757,
20223,
253,
4477,
878,
281,
2589,
4679,
4645,
326,
1775,
5997,
37139,
414,
285,
37139,
414,
5997,
247,
5141,
275,
344,
859,
757,
20223,
436,
812,
320,
2218,
407,
3963,
3006,
281,
2559,
70,
719,
511,
37139,
414,
342,
4632,
1293,
1775,
285,
17565,
253,
2538,
327,
344,
859,
757,
20223,
285,
7200,
5010,
253,
4477,
943,
15026,
436,
1750,
281,
5386,
253,
10466,
1775,
1033,
1032,
414,
35659,
757,
46449,
50276,
783,
2801,
22429,
2572,
27594,
253,
7744,
908,
2801,
10027,
778,
417,
320,
271,
3576,
37820,
3815,
50276,
31973,
275,
2801,
22429,
3815,
403,
12497,
281,
15249,
436,
1750,
253,
4477,
943,
10280,
616,
6260,
342,
11962,
8322,
273,
2801,
10027,
342,
4632,
1293,
1775,
50276,
66,
1930,
8310,
310,
326,
12401,
501,
47301,
285,
13361,
2617,
895,
398,
362,
953,
452,
6685,
23507,
3939,
8512,
19678,
253,
39296,
273,
3280,
2460,
20412,
285,
253,
5350,
323,
2990,
819,
25004,
50276,
953,
417,
2590,
281,
479,
849,
37139,
414,
30376,
281,
2460,
12097,
39296,
436,
310,
247,
1077,
4722,
2934,
285,
943,
320,
1805,
5544,
50276,
23955,
4722,
4560,
310,
326,
362,
953,
3045,
6351,
671,
30376,
281,
21541,
4116,
8115,
4508,
625,
1153,
14985,
3472,
1491,
670,
24705,
26405,
50275,
74,
2953,
436,
275,
271,
4321,
4385,
7472,
436,
1750,
342,
247,
26405,
4836,
285,
263,
1966,
4665,
1430,
4679,
50275,
977,
8680,
50276,
29965,
8537,
247,
8113,
39707,
342,
271,
42115,
8492,
323,
33643,
824,
347,
2410,
262,
9527,
46351,
1162,
355,
43425,
260,
20282,
259,
86,
1162,
355,
43425,
285,
263,
260,
6917,
1149,
80,
1162,
355,
43425,
812,
17084,
247,
2257,
273,
253,
3916,
275,
253,
2929,
285,
9729,
253,
1543,
432,
362,
953,
281,
501,
47301,
50276,
783,
4477,
1375,
326,
50276,
20513,
1543,
921,
326,
1097,
1775,
285,
35919,
569,
1056,
253,
2957,
13016,
6507,
327,
3388,
253,
3064,
310,
326,
1775,
546,
36217,
253,
6032,
1255,
407,
8493,
253,
6253,
16841,
3066,
247,
7221,
991,
15895,
281,
22318,
253,
9065,
5045,
10076,
1223,
35919,
569,
11823,
253,
548,
1704,
511,
16841,
285,
3185,
6032,
253,
13016,
689,
253,
10746,
8664,
253,
42115,
31306,
5802,
407,
253,
35919,
569,
50276,
2520,
13812,
875,
3388,
285,
9065,
5045,
16841,
310,
271,
4722,
906,
891,
1158,
253,
4477,
943,
9017,
616,
1783,
285,
2939,
253,
3388,
6507,
1255,
273,
643,
3210,
342,
4632,
32063,
1775,
50276,
249,
2593,
5976,
1805,
31640,
253,
2538,
273,
1775,
327,
31640,
275,
501,
47301,
943,
320,
5393,
33810,
2558,
2544,
275,
7103,
17082,
824,
347,
7200,
1364,
320,
7616,
347,
4103,
390,
7880,
26332,
7155,
1127,
4720,
253,
4477,
878,
281,
1453,
323,
253,
1055,
273,
253,
4076,
7200,
7756,
310,
253,
1818,
275,
4440,
257,
14069,
7200,
4067,
685,
3264,
1677,
253,
8245,
1818,
275,
4440,
257,
292,
7200,
50276,
455,
253,
14777,
275,
4677,
374,
943,
671,
3831,
247,
501,
3024,
17472,
342,
285,
1293,
1775,
50276,
74,
2096,
326,
627,
403,
2317,
10806,
533,
253,
9084,
37139,
414,
1543,
943,
320,
17253,
275,
247,
4677,
4931,
1908,
16248,
3036,
84,
374,
2428,
281,
1056,
625,
2317,
50276,
783,
9759,
273,
253,
4499,
422,
4715,
1543,
2593,
8073,
943,
2486,
1543,
323,
253,
8245,
501,
3024,
17472,
50276,
13206,
577,
66,
943,
921,
2193,
1097,
342,
285,
1293,
1775,
50276,
953,
417,
2590,
281,
479,
849,
253,
37139,
414,
273,
33947,
1396,
569,
5936,
253,
2442,
39296,
273,
2460,
20412,
50276,
783,
4477,
943,
2486,
247,
5955,
273,
7364,
273,
616,
789,
50276,
1686,
2617,
895,
398,
285,
362,
953,
403,
625,
2834,
281,
6194,
685,
260,
79,
2224,
436,
1263,
2722,
326,
1775,
310,
1077,
3576,
323,
3733,
13361,
2617,
895,
398,
285,
362,
953,
285,
9437,
281,
2096,
2139,
841,
812,
512,
320,
1774,
9021,
281,
4382,
8113,
891,
717,
13762,
326,
1775,
2987,
973,
323,
362,
953,
285,
13361,
2617,
895,
398,
533,
436,
2929,
310,
12497,
314,
26565,
6108,
479,
10915,
8498,
758,
273,
2139,
352,
2987,
891,
13414,
1158,
697,
7470,
323,
9311,
275,
697,
1655,
1375,
533,
352,
5604,
812,
320,
846,
38549,
50276,
7152,
33032,
284,
253,
2323,
273,
8113,
39707,
9084,
556,
2011,
697,
19316,
323,
4382,
8113,
8892,
436,
2929,
2340,
684,
247,
625,
3576,
1039,
273,
3733,
247,
9084,
2096,
247,
2629,
4440,
257,
292,
3215,
26208,
4758,
824,
347,
642,
4465,
3733,
941,
285,
642,
2266,
941,
42072,
50276,
249,
2087,
1293,
1110,
2515,
247,
6867,
9084,
476,
417,
1347,
347,
1175,
347,
7561,
27311,
267,
1754,
2990,
35615,
824,
347,
501,
3024,
50276,
2520,
2929,
12453,
253,
2523,
432,
253,
1127,
273,
2957,
13016,
285,
840,
12661,
281,
897,
247,
1805,
13757,
5700,
4907,
9479,
13823,
7221,
6081,
1775,
323,
9084,
2905,
10336,
13757,
50276,
3113,
253,
4081,
1775,
9084,
476,
5115,
1805,
7200,
3012,
2096,
2629,
4440,
257,
292,
3733,
19462,
7241,
50276,
249,
1635,
281,
253,
5520,
3045,
327,
4440,
257,
292,
436,
2929,
2007,
2722,
253,
24426,
273,
253,
4116,
3711,
285,
253,
5520,
3045,
327,
643,
4893,
824,
347,
4499,
422,
4715,
285,
48960,
3733,
436,
2929,
310,
973,
3542,
285,
10932,
50276,
783,
16038,
3249,
432,
253,
2957,
37328,
5301,
323,
1027,
2990,
10336,
824,
347,
501,
3024,
9084,
285,
33947,
50276,
783,
24426,
556,
4950,
253,
1895,
273,
253,
1655,
9084,
285,
33947,
347,
1097,
273,
731,
5257,
281,
29623,
387,
9479,
46836,
534,
588,
2701,
253,
26647,
13789,
50276,
284,
253,
2523,
556,
644,
4518,
8042,
562,
253,
2900,
310,
15246,
3693,
253,
9479,
1980,
46836,
672,
3733,
9084,
50276,
10328,
326,
436,
2929,
47932,
1775,
347,
271,
3662,
50276,
20261,
1775,
310,
432,
5368,
789,
436,
2929,
4245,
247,
1175,
8813,
327,
2139,
2190,
1027,
5556,
6081,
1775,
812,
320,
247,
1175,
4327,
50275,
249,
1635,
281,
253,
1332,
436,
2929,
556,
2011,
326,
327,
2460,
9162,
50276,
45842,
422,
4715,
285,
48960,
3733,
9084,
476,
320,
12724,
5520,
342,
1775,
50276,
20432,
253,
24426,
273,
4116,
3711,
259,
1775,
253,
9084,
4735,
3711,
1057,
3831,
625,
35185,
50273,
585,
1209,
2224,
891,
5194,
342,
326,
1655,
9084,
14280,
281,
29623,
387,
9479,
1980,
46836,
50276,
262,
812,
320,
253,
1895,
432,
253,
10393,
273,
5368,
5556,
6081,
50276,
251,
253,
643,
1133,
253,
1895,
778,
671,
3249,
432,
253,
3753,
273,
1655,
9084,
2216,
50276,
45019,
8113,
4979,
398,
824,
347,
1863,
249,
268,
87,
18698,
19,
1936,
80,
512,
273,
731,
556,
4518,
2011,
326,
407,
39793,
253,
10336,
273,
9084,
352,
476,
671,
3693,
253,
1980,
46836,
50276,
3169,
327,
1655,
7714,
50276,
262,
310,
7202,
326,
849,
1775,
476,
320,
908,
323,
1863,
249,
390,
268,
87,
18698,
19,
285,
1880,
1775,
476,
320,
908,
347,
247,
2087,
5556,
6081,
323,
10341,
8113,
39707,
3733,
50275,
18,
1863,
249,
39707,
24498,
8113,
39707,
970,
14728,
8323,
50276,
19,
268,
87,
18698,
19,
5520,
1666,
25379,
342,
39694,
8113,
39707,
50276,
20,
1936,
80,
8113,
29338,
254,
323,
5304,
8981,
275,
4583,
891,
6273,
323,
14924,
323,
436,
2929,
533,
891,
1158,
352,
476,
320,
2007,
5520,
50276,
249,
253,
4679,
501,
47301,
312,
1543,
403,
12532,
253,
1750,
812,
320,
10046,
604,
643,
8113,
4979,
398,
476,
320,
6760,
347,
9084,
556,
644,
7561,
9125,
323,
697,
31334,
10336,
2216,
50270,
7152,
33032,
2520,
2929,
33623,
253,
18925,
327,
7863,
3215,
26208,
285,
941,
42072,
352,
18653,
253,
6032,
1255,
970,
247,
4102,
4081,
9479,
1255,
13823,
5556,
6081,
281,
3157,
253,
7200,
285,
31640,
273,
253,
362,
953,
285,
13361,
2617,
895,
398,
436,
2929,
556,
5183,
326,
253,
9479,
1255,
13823,
5556,
6081,
812,
320,
19732,
2961,
281,
9510,
253,
3045,
273,
362,
953,
285,
13361,
2617,
895,
398,
1293,
3215,
26208,
285,
2266,
35919,
1942,
790,
327,
1027,
8892,
1690,
22296,
48960,
4499,
422,
285,
3700,
4715,
50275,
783,
2929,
556,
5196,
271,
11921,
273,
4679,
327,
1027,
8892,
281,
7568,
253,
12510,
273,
253,
9479,
1255,
13823,
41458,
1775,
50275,
783,
4477,
1246,
247,
11088,
1783,
327,
1775,
253,
3910,
285,
22620,
875,
1775,
285,
941,
42072,
403,
973,
5544,
50274,
2520,
2929,
4648,
253,
5368,
2934,
273,
9479,
1255,
13823,
41458,
281,
33623,
253,
18925,
273,
7863,
3215,
26208,
285,
2266,
42072,
534,
2789,
253,
38135,
3710,
50275,
1189,
455,
436,
2929,
310,
973,
2416,
257,
253,
18389,
273,
1775,
285,
253,
8813,
273,
690,
7313,
403,
4518,
3559,
16280,
253,
4477,
7568,
253,
1055,
400,
561,
84,
273,
1775,
275,
2710,
8892,
407,
4645,
247,
1781,
2408,
273,
5661,
1543,
253,
5661,
629,
310,
11088,
285,
21414,
253,
2201,
32489,
273,
436,
2929,
310,
326,
253,
1775,
310,
271,
5368,
2934,
1014,
2167,
253,
4477,
897,
352,
281,
11399,
253,
2523,
3982,
407,
5536,
3215,
26208,
285,
941,
35919,
569,
3103,
436,
2929,
310,
751,
18597,
253,
12510,
273,
271,
5368,
2934,
407,
2509,
1142,
5661,
3588,
569,
50276,
187,
187,
4118,
18435,
27,
5740,
50276,
783,
2929,
14371,
326,
5919,
35615,
824,
347,
4979,
398,
285,
13361,
2617,
895,
398,
534,
513,
417,
16584,
33103,
32270,
14417,
275,
253,
2216,
672,
3963,
1025,
342,
1775,
9479,
1255,
6600,
41458,
476,
5115,
1072,
390,
1805,
3045,
347,
27311,
267,
6928,
275,
253,
8113,
3237,
835,
253,
27311,
267,
6928,
497,
21533,
8936,
342,
941,
42072,
390,
417,
3963,
1025,
390,
417,
253,
2929,
14371,
352,
1077,
16575,
949,
1142,
4679,
285,
1783,
273,
253,
2957,
9421,
50275,
33642,
50276,
49794,
50276,
74,
1089,
253,
2929,
281,
320,
1077,
14793,
275,
697,
3634,
352,
556,
247,
13406,
7031,
273,
5661,
2175,
285,
1027,
897,
2219,
1775,
50276,
2321,
16977,
4499,
422,
48960,
3700,
4715,
347,
973,
347,
28913,
2175,
824,
347,
7562,
806,
8090,
27311,
267,
253,
30628,
452,
2546,
2007,
3533,
285,
253,
4477,
497,
2104,
281,
2589,
9056,
4679,
1561,
253,
5955,
2180,
4751,
15974,
512,
7350,
285,
2403,
253,
4342,
273,
253,
2929,
1014,
625,
11088,
285,
21414,
50276,
6438,
253,
30080,
22559,
495,
30628,
497,
323,
2997,
581,
42876,
1840,
285,
581,
42876,
2708,
275,
253,
6158,
1083,
253,
4468,
369,
326,
253,
2929,
310,
271,
5661,
1263,
273,
247,
1929,
1332,
1775,
1223,
891,
2096,
326,
1142,
8607,
403,
16764,
10527,
285,
16694,
1543,
432,
17857,
32888,
9380,
891,
1089,
326,
352,
1057,
417,
3657,
14924,
6296,
253,
5661,
4342,
275,
436,
2929,
403,
327,
247,
3511,
9400,
812,
320,
273,
4618,
1600,
285,
812,
1421,
281,
247,
1818,
273,
22199,
275,
20462,
3210,
4404,
625,
12314,
4394,
327,
253,
643,
1133,
352,
812,
816,
5224,
326,
260,
79,
2224,
403,
417,
4751,
38883,
616,
2442,
24088,
417,
38883,
253,
3634,
973,
2217,
275,
253,
8763,
8090,
50276,
936,
755,
625,
12288,
891,
717,
1335,
12371,
849,
253,
13650,
21319,
604,
253,
3280,
310,
14728,
407,
247,
1643,
15115,
275,
260,
9866,
285,
4979,
398,
352,
3133,
4828,
565,
48714,
326,
2403,
253,
806,
8090,
275,
9084,
816,
271,
13361,
81,
273,
2460,
20412,
310,
247,
1175,
2216,
33810,
4751,
27311,
267,
3210,
1581,
281,
1379,
3280,
273,
271,
10341,
1979,
285,
3388,
253,
13650,
327,
253,
3453,
604,
352,
4592,
281,
320,
4067,
685,
337,
89,
18,
50275,
17480,
2410,
17009,
403,
671,
908,
323,
24088,
24705,
26405,
285,
1006,
800,
3210,
581,
943,
417,
285,
253,
4477,
513,
417,
275,
253,
2929,
37271,
731,
1512,
3809,
923,
671,
247,
3332,
789,
16248,
4979,
398,
285,
27311,
267,
6928,
260,
864,
1162,
355,
17857,
17312,
43425,
1649,
19946,
253,
8113,
19771,
39707
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
10234,
32270,
14417,
285,
33643,
1361,
501,
47301,
8773,
432,
3076,
1980,
46836,
672,
10166,
327,
5304,
941,
50276,
2520,
1750,
310,
417,
17285,
253,
2538,
273,
10234,
32270,
14417,
285,
33643,
327,
1775,
403,
1620,
5762,
3587,
4543,
310,
253,
4473,
273,
247,
3076,
1980,
5927,
4518,
2931,
347,
5125,
1078,
253,
4477,
943,
10280,
616,
6260,
970,
8113,
4979,
398,
342,
27311,
455,
3100,
42115,
31306,
253,
4477,
943,
671,
4518,
4853,
752,
597,
1599,
407,
247,
3076,
1980,
5927,
50276,
1615,
18918,
690,
15276,
1566,
3607,
359,
1089,
326,
253,
3210,
846,
1775,
4796,
253,
344,
859,
757,
20223,
407,
27149,
653,
9332,
8512,
327,
4440,
257,
292,
3340,
275,
253,
806,
1643,
8090,
50276,
2520,
3448,
8018,
247,
19349,
5931,
326,
1775,
5459,
37139,
414,
285,
253,
2559,
37139,
414,
11355,
344,
859,
757,
20223,
253,
1655,
4679,
476,
760,
15249,
253,
1750,
326,
1775,
5459,
37139,
414,
285,
11355,
344,
859,
757,
20223,
253,
4477,
878,
281,
2589,
4679,
4645,
326,
1775,
5997,
37139,
414,
285,
37139,
414,
5997,
247,
5141,
275,
344,
859,
757,
20223,
436,
812,
320,
2218,
407,
3963,
3006,
281,
2559,
70,
719,
511,
37139,
414,
342,
4632,
1293,
1775,
285,
17565,
253,
2538,
327,
344,
859,
757,
20223,
285,
7200,
5010,
253,
4477,
943,
15026,
436,
1750,
281,
5386,
253,
10466,
1775,
1033,
1032,
414,
35659,
757,
46449,
50276,
783,
2801,
22429,
2572,
27594,
253,
7744,
908,
2801,
10027,
778,
417,
320,
271,
3576,
37820,
3815,
50276,
31973,
275,
2801,
22429,
3815,
403,
12497,
281,
15249,
436,
1750,
253,
4477,
943,
10280,
616,
6260,
342,
11962,
8322,
273,
2801,
10027,
342,
4632,
1293,
1775,
50276,
66,
1930,
8310,
310,
326,
12401,
501,
47301,
285,
13361,
2617,
895,
398,
362,
953,
452,
6685,
23507,
3939,
8512,
19678,
253,
39296,
273,
3280,
2460,
20412,
285,
253,
5350,
323,
2990,
819,
25004,
50276,
953,
417,
2590,
281,
479,
849,
37139,
414,
30376,
281,
2460,
12097,
39296,
436,
310,
247,
1077,
4722,
2934,
285,
943,
320,
1805,
5544,
50276,
23955,
4722,
4560,
310,
326,
362,
953,
3045,
6351,
671,
30376,
281,
21541,
4116,
8115,
4508,
625,
1153,
14985,
3472,
1491,
670,
24705,
26405,
50275,
74,
2953,
436,
275,
271,
4321,
4385,
7472,
436,
1750,
342,
247,
26405,
4836,
285,
263,
1966,
4665,
1430,
4679,
50275,
977,
8680,
50276,
29965,
8537,
247,
8113,
39707,
342,
271,
42115,
8492,
323,
33643,
824,
347,
2410,
262,
9527,
46351,
1162,
355,
43425,
260,
20282,
259,
86,
1162,
355,
43425,
285,
263,
260,
6917,
1149,
80,
1162,
355,
43425,
812,
17084,
247,
2257,
273,
253,
3916,
275,
253,
2929,
285,
9729,
253,
1543,
432,
362,
953,
281,
501,
47301,
50276,
783,
4477,
1375,
326,
50276,
20513,
1543,
921,
326,
1097,
1775,
285,
35919,
569,
1056,
253,
2957,
13016,
6507,
327,
3388,
253,
3064,
310,
326,
1775,
546,
36217,
253,
6032,
1255,
407,
8493,
253,
6253,
16841,
3066,
247,
7221,
991,
15895,
281,
22318,
253,
9065,
5045,
10076,
1223,
35919,
569,
11823,
253,
548,
1704,
511,
16841,
285,
3185,
6032,
253,
13016,
689,
253,
10746,
8664,
253,
42115,
31306,
5802,
407,
253,
35919,
569,
50276,
2520,
13812,
875,
3388,
285,
9065,
5045,
16841,
310,
271,
4722,
906,
891,
1158,
253,
4477,
943,
9017,
616,
1783,
285,
2939,
253,
3388,
6507,
1255,
273,
643,
3210,
342,
4632,
32063,
1775,
50276,
249,
2593,
5976,
1805,
31640,
253,
2538,
273,
1775,
327,
31640,
275,
501,
47301,
943,
320,
5393,
33810,
2558,
2544,
275,
7103,
17082,
824,
347,
7200,
1364,
320,
7616,
347,
4103,
390,
7880,
26332,
7155,
1127,
4720,
253,
4477,
878,
281,
1453,
323,
253,
1055,
273,
253,
4076,
7200,
7756,
310,
253,
1818,
275,
4440,
257,
14069,
7200,
4067,
685,
3264,
1677,
253,
8245,
1818,
275,
4440,
257,
292,
7200,
50276,
455,
253,
14777,
275,
4677,
374,
943,
671,
3831,
247,
501,
3024,
17472,
342,
285,
1293,
1775,
50276,
74,
2096,
326,
627,
403,
2317,
10806,
533,
253,
9084,
37139,
414,
1543,
943,
320,
17253,
275,
247,
4677,
4931,
1908,
16248,
3036,
84,
374,
2428,
281,
1056,
625,
2317,
50276,
783,
9759,
273,
253,
4499,
422,
4715,
1543,
2593,
8073,
943,
2486,
1543,
323,
253,
8245,
501,
3024,
17472,
50276,
13206,
577,
66,
943,
921,
2193,
1097,
342,
285,
1293,
1775,
50276,
953,
417,
2590,
281,
479,
849,
253,
37139,
414,
273,
33947,
1396,
569,
5936,
253,
2442,
39296,
273,
2460,
20412,
50276,
783,
4477,
943,
2486,
247,
5955,
273,
7364,
273,
616,
789,
50276,
1686,
2617,
895,
398,
285,
362,
953,
403,
625,
2834,
281,
6194,
685,
260,
79,
2224,
436,
1263,
2722,
326,
1775,
310,
1077,
3576,
323,
3733,
13361,
2617,
895,
398,
285,
362,
953,
285,
9437,
281,
2096,
2139,
841,
812,
512,
320,
1774,
9021,
281,
4382,
8113,
891,
717,
13762,
326,
1775,
2987,
973,
323,
362,
953,
285,
13361,
2617,
895,
398,
533,
436,
2929,
310,
12497,
314,
26565,
6108,
479,
10915,
8498,
758,
273,
2139,
352,
2987,
891,
13414,
1158,
697,
7470,
323,
9311,
275,
697,
1655,
1375,
533,
352,
5604,
812,
320,
846,
38549,
50276,
7152,
33032,
284,
253,
2323,
273,
8113,
39707,
9084,
556,
2011,
697,
19316,
323,
4382,
8113,
8892,
436,
2929,
2340,
684,
247,
625,
3576,
1039,
273,
3733,
247,
9084,
2096,
247,
2629,
4440,
257,
292,
3215,
26208,
4758,
824,
347,
642,
4465,
3733,
941,
285,
642,
2266,
941,
42072,
50276,
249,
2087,
1293,
1110,
2515,
247,
6867,
9084,
476,
417,
1347,
347,
1175,
347,
7561,
27311,
267,
1754,
2990,
35615,
824,
347,
501,
3024,
50276,
2520,
2929,
12453,
253,
2523, 432, 253, 1127, 273, …
] |
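The long integer runs that bracket each review/summary pair in this dump (such as the run ending just above, and the all-ones runs further below) look like tokenizer output for the corresponding text columns. The sketch below shows one way such a record could be assembled; the tokenizer choice, the prompt-template wording, and the assumption that the label ids simply copy the input ids are illustrative guesses, not anything stated in the dump itself.

```python
# Illustrative sketch only: how a review/summary pair could be turned into the
# kind of integer arrays dumped alongside each record here. The tokenizer
# ("gpt2") is a placeholder assumption; the ids in this dump clearly come from
# a different vocabulary.
from transformers import AutoTokenizer

def build_record(review_text: str, summary_text: str, max_len: int = 2048) -> dict:
    tok = AutoTokenizer.from_pretrained("gpt2")
    prompt = (
        "Below is a review of a research paper from a conference journal. "
        "Please write a summary of the review.\n"
        "### Review:\n" + review_text + "\n### Summary:\n"
    )
    enc = tok(prompt + summary_text, truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],            # integer token ids, like the long lists in this dump
        "attention_mask": enc["attention_mask"],  # all 1s when no padding is applied
        "labels": list(enc["input_ids"]),         # assumed: labels mirror the input ids for causal-LM training
    }
```

Running `build_record` on one of the review/summary pairs below would produce arrays of the same general shape as the ones shown here, capped at a couple of thousand tokens.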
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper tackles approximate inference for hierarchical bayesian models with fully nested structure the specific approach taken is variational inference with a qdistribution that iteratively applies conditional normalizing flows to derive a hierarchical representation and factorizes in a manner parallel to the generative model by reusing flow parameters thus having a number of parameters that does not grow with cardinality of each plate the benefits of the model are illustrated in a few synthetic experiments as well as in application to a human neuroimaging dataset strengths the motivation for the work is clear and its likely to be of broad interest factorizing the approximation network to match the structure of the model is interesting and novel to my knowledge and the experiments build up from synthetic toy examples to a realworld problem opportunities for improvement overall framing and relation to broader approximate inference the overall framing of the paper starts from the perspective of improving the tractability of normalizing flows for posterior approximation rather than improving approximate inference and takes its aim primarily at flowbased methods notably cascading flows rather than approximate inference more broadly i think broader engagement with the vi literature may improve the paper including particularly some mention of the approximation vs amortization gap this then situates the mf vi baseline as not just a common practice method but as far as i can tell the upper bound on performance of the remaining methods which share its approximation gap but add an amortization performance gap im not sure i understand the nonconjugate results in this context though is it because the link function is applied to the flow approximation but not in the mf context relatedly i think the discussion from supplement b2 is worth including in the main paper the hvm and ssvi approaches among others seem like a natural complement to the present work making these connections seems more important than engaging with sbi and likelihoodfree inference i dont think likelihoodfree approaches are ever competitive with likelihoodbased approaches if likelihoods are available nor is this their intent so im not sure why they are an important baseline or particularly relevant to the current work finally as a minor point i wonder if the more typical reader here would come from the vi literature for whom leading with the elbo rather than reverse kl framing would be more intuitive pushing the synthetic examples to regimes where adavi wins none of the examples actually show that adavi is a superior approach in the nc case it is roughly equivalent in performance to cf in gm adavi may achieve the best elbo out of the amortized methods but takes an order of magnitude more time than just running mfvi so the benefit of amortization is not shown in the gaussian random effects example fig3 its clear that adavi has favorable scaling but even in the largest example mfvi has fewer parameters and achieves higher accuracy in less time furthermore im not sure if the selected metrics truly show the benefit of amortization rather than showing optimization time for all the models shouldnt the paper show the predictive inference time and performance for held out data for all the models then the amortized models should be dramatically faster by not requiring any optimization finally im not sure why timings are not done on consistent hardware it seems like everything in the paper should run on both cpu and gpu so picking one or 
showing timing for both would be better than trying to compare timings between two different sets of hardware including quantitative evaluation on the real problem while the realworld results match qualitative patterns expected from prior work the paper needs to do more to quantitatively show how adavi is indeed superior for these kinds of models the best way to do this would be reporting crossvalidated predictive loss eg by holding out subjects times or connectivity measures and comparing both the loss and runtimes against other models minor comments and typos section 22 could be clarified further for example is the set transformer really elementwise or is it elementwise wrt the leaf random variables in the graph if the contraction operator isnt to tensor contraction could a different word be used considering that tensor notation is already used and contraction has a common meaning in that setting mathcalbh is defined twice above and below expr 1 both upper and lowercase xij is used is it the same or different the meaning of the superscripts on x is not explained i think one is the plate index and the other is the datapoint index but not sure in addition the paper should clarify the full algorithm is everything trained endtoend or are the sts trained first and then the model parameters section 32 could potentially pick a more practicallyrelevant nonconjugate example from the literature there are plenty of examples in neuroscience though im not sure if any have nondiscrete latents fig2 and 3 are missing error barsribbons x sim mathrmmixldots is not defined and not standard to my knowledge im assuming this is multinomial over mixture components given pi and each component is gaussian expressions 9a9b are not arranged in an intuitive way definitions and priors are intermixed everything is in one giant block an annotated figure even a plate diagram would be better and the specifics can be left to the supplement especially since the paper doesnt give enough info to understand the model beyond the kong et al citation hierarchical graphical models of the sorts considered here are in wide usage by scientific practitioners and automatic approximate inference methods that are efficient and accurate are of longstanding interest thus the motivation of the work is clear the nature of the contribution is likewise clear my primary concern is that the evaluation doesnt fully demonstrate the benefits of the approach eg using predictive outofsample loglikelihoods rather than training elbos including a quantitative evaluation of the realworld problem providing consistent timings pushing the scale until adavi dominates secondarily i think the overall framing clarity and writing could be improved i think the work is below the bar now and am rating accordingly but i think my concerns should be sufficiently addressable in rebuttal for the work to be appropriate for the conference docsepthis work introduces adavi an approximate inference algorithm for hierarchical bayesian models hbms the approach is similar to npe from simulationbased inference but exploits the hierarchical structure of the forward model to generate an efficient variational family automatically experiments demonstrate the applicability of the method on hbms of increasing complexity including a challenging neuroimaging model results indicate good performance against other amortized methods strengths the method deviates from blackbox simulators and instead exploits the hierarchical nature of many forward processes to scale to highdimensional 
parameter spaces experiments are detailed and results are discussed carefully adavi shows good performance in particular against snpe but nonamortized methods are shown to achieve higher elbo the supplementary materials are thorough and include helpful experimental details as well as further discussions weaknesses population studies are taken as an example of large hbms with millions of parameters i believe it would be fair to mention that oftentimes only a few of those many parameters are actual of interest for scientific inquiry in this case most of the parameters are latent variables for which no explicit estimation is necessary would you argue that adavi is still relevant in this case i would say it may still be relevant but i would be curious in having your opinion could you also better motivate in which scientific cases inference over millions of parameters is strictly necessary in section 23 the variational distribution is defined as a meanfield approximation can you comment on the constraints that result from this assumption results are discussed in terms of the elbo only the quality of the approximate posteriors is never diagnosed explicitly at least as far as i can see i would appreciate some comments on the impact of the size of the encoding space produced by the settransformers how shall one set this size should it be the same across all levels small remarks avoid footnotes if you can replace fig x with fig x this paper addresses a common problem of simulationbased inference and proposes a sound and efficient solution to enable inference in highdimensional parameter spaces experiments show convincing results my main issue is the lack of quality checks of the approximate posteriors produced by adavi this should matter the most in my opinion well above efficiency wallclock times and the number of parameters if the approximate posteriors are wrong none of those matter for now i do not recommend this paper for acceptance but i will be pleased to change and increase my evaluation if the authors can present diagnostics of the resulting approximate posteriors docsepthis manuscript proposes an amortized variational inference to produce a dual variational family for hierarchical bayesian models hbm that can be represented as pyramidal bayesian models the presented method exploits the exchangeability of parameters in hbm to reduce its parametrization for faster inference on highdimensional data such as neuroimaging the authors compare empirically the proposed method with several amortized and nonamortized alternatives and on several experimental data in terms of the size of parametrization inference time and quality of inferences strength a good motivation extending the applications of simulationbased inference and structured variational inference to highdimensional data settings such as neuroimaging data is interesting weaknesses the proposed adavi method only applies to a pyramidal class of bayesian networks in which dependency structure follows a pyramidal graph the authors need to motivate in the text that this class of problems covers most cases in the target applications in neuroimaging studies the proposed method in its ultimate performance becomes similar to a simple meanfield approximation the authors claim the proposed method instead provides more expressivity while failing to show this point in the experiments the performance improvements in terms of quality of inference and inference time compared to other alternatives remain marginal results in table 12 and figure 3 despite 
the promising abstract and introduction setting a high expectation on the highdimensionality of target problems thousands of brain locations the experiment on neuroimaging data is merely conducted on a relatively small dataset with 30 subjects and 1483 measures this barely fits with the requirements in the field these days we need methods that can deal with considerably larger datasets consisting of thousands of subjects with thousands of measures for example ukbiobank furthermore it would be nice to also compare quantitatively the time complexity and inference quality of adavi with other alternatives when applied to neuroimaging data minor suggestions comments and questions it may be nice to motivate the importance of hbr in the neuroimaging context why it is important to be able to handle the hierarchically organized data in the neuroscience context section 21 direct acyclic graphs directed acyclic graphs throughout the text fig fig section 21 a rvs hierarchy an rvs hierarchy what do you mean by symmetry in exploiting the symmetry induced by the plates section 3 a a hierarchy a hierarchy is the curves in fig 2a derived empirically then why the wall time for adavi remains fixed until 105 examples please explain while the main arguments are around the lower computational complexity of the proposed method it is very difficult to judge the time improvements when diverse hardware are used in a heterogeneous cluster of computers for example two types of cpu would be nice to fix the hardware setup to ensure fairness and reliability of comparisons i suggest moving sections b2 and b3 to the main text possibly to the final discussion as they contain important information about the presented method would be great also to see some results in section e5 in the main text on how the individual network can deviate from the population network that would be of high interest to the audience from the neuroscience community despite a nice motivation the empirical results show a minor improvement over the already existing methods the results are also weakly presented for the target problem in the neuroimaging context small data no quantitative comparison lack of discussion the method is not tested on very high dimensional settings as they are in the neuroimaging context docsepthis paper proposes to derive an automatic methodology that takes as input a generative hbm and generates a dual variational family able to perform amortized parameter inference the proposed method can be used to the context of pyramidally structured data with good inference quality and a favorable training time overall i am not an expert in hbms and neuroscience but based on the introduction in this work the proposed method is meaningful to the field of bayesian modelling for the neuroimaging as it can estimate posterior distributions for a generative model of pyramidal structure which is difficult to achieve authors provide sufficient experiments to support their claim in terms of both the inference quality and the speed so i hold a positive attitude toward this paper the paper is not very readerfriendly so it would be better that authors can provide the introduction of some basic concepts and more related works in their final version to improve the readability of their paper please refer to the main review section
### Summary: | the paper provides a unique contribution to the scalability of bayesian inference to pyramidal bayesian models with application to neuroimaging the major point of concern by the reviewers is around how close is the inference approach to the more classical meanfield vi however in my opinion the authors have addressed these concerns in the rebuttal therefore i recommend accept | [
7777, 921, 253, 5649, 273, …
] | [
1, 1, 1, …
] | [
7777, 921, 253, 5649, 273, …
] |
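The reviews earlier in this record compare methods almost entirely through the ELBO and repeatedly distinguish an approximation gap (from the restricted, e.g. mean-field, variational family) from an amortization gap (from sharing one inference network across datasets). For reference, a standard statement of these quantities — not specific to the paper under review — is:

```latex
\log p(x) \;=\; \underbrace{\mathbb{E}_{q(z)}\big[\log p(x,z) - \log q(z)\big]}_{\mathrm{ELBO}(q)}
\;+\; \mathrm{KL}\big(q(z)\,\|\,p(z\mid x)\big),
\qquad q(z) = \prod_i q_i(z_i)\ \text{(mean field)}
```

and, writing \(q^\star\) for the best member of the chosen variational family,

```latex
\log p(x) - \mathrm{ELBO}\big(q_\phi(\cdot\mid x)\big)
\;=\; \underbrace{\mathrm{KL}\big(q^\star \,\|\, p(z\mid x)\big)}_{\text{approximation gap}}
\;+\; \underbrace{\mathrm{ELBO}(q^\star) - \mathrm{ELBO}\big(q_\phi(\cdot\mid x)\big)}_{\text{amortization gap}} .
```

Under this decomposition an amortized posterior can at best match the per-dataset optimum of the same family, which is the sense in which the reviewers treat plain mean-field VI as an upper bound on the amortized variants discussed above.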
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper deals with an interesting problem the presentation is clear the the approach intuitive however the reviewer has some concerns about the pertinence of the approach and the relationship with related work it would be very helpful if the authors could contrast and compare the proposed approach both qualitatively and quantitatively in their numerical experiments with methods for sparse nonnegative matrix factorization these would also lend themselves to causal interpretation the need for modeling probabilities pui in the key motivating applications is questionable indeed in both recommendation systems and gene expression datasets the observations are not readily in the form of probabilities for instance in the experiments the authors normalize the level of gene expressions to make is look as a probability which by the way is very different from the iid uniform setup considered in setting 1 for the movielens dataset it is unclear how the data was preprocessed to obtain the observed pui docsepthe paper considers a solution to a statistical association problem the proposed solution involves a decomposition they call a kolmogorov model what sort is not justified in any way and confused me a lot the decomposition has two parts 1 a discrete basis function that needs to be discovered and 2 a discrete distribution over the basis elements the define an optimization problem 2 which has a data term and some binary and simplex constraints and they propose a relaxation and decomposition of this optimization problem they go on to claim that mutual causal relations can be then inferred by inspecting the representations they have learnt but they give little details on how and what impacts this distinction has in practice this may be obvious to a subfield expert but it is not clear to me at all the paper is locally consistent but i have trouble understanding the contribution and placing in the broad machine learning field i am not an expert in causality so i cannot evaluate the contribution but i can say that what interests me are section 22 and sec 45 and they both require a lot better writing 22 made things much more intuitive but i fail to see how the indicator variable annotations action scifi etc can possibly come out of the data i think this is an important point to support the interpretability claim as for 4 i think there is room for intuition building there as well as limitations eg what sort of inferences can be made and not etc finally for 5 i find that very interesting but i find it difficult to have the right intuition about what the support condition means and how that helps in a practical setting pros causality and interpretability are major directions of research seems like a valid contribution on an interesting problem cons the highlevel picture is relatively clear but i find important things very difficult to grasp the kolmogorov model definition i find confusing but i am not an expert in causality the introduction should give some intuition about what that is and why it is a good idea find it very hard to have a coherent picture of the limitations and assess the contributions of the paperdocsepthe reviewer finds that the proposed method interesting the model is very clean and the implication in causal inference is significant the writing is also clean and clear the reviewer has several concerns 1 the algorithm seems not very scalable in the two subproblems there is one solved by a large number of parallel sdrs sdr is quite expensive and for each column in the data matrix one has to 
solve an sdr in each iteration this is too much for large scale recommender systems in fact in the experiment 1 on movielens the algorithm was only tested on a notsolarge dataset and run 5 iterations the reviewer feels that more scenarios should be tested eg more iterations various sizes of dataset etc fixing the number of iterations also sounds a bit funny since it is more intuitive to stop the algorithm using some validation set or when the algorithm converges under a certain criterion 2 the algorithm works with probability of binary data this is quite hard to estimate in practice for example people likes a movie for only once it is hard to tell what is the probability of generating this like it seems that the experiment part of this paper did not clearly state how to obtain the probability that the algorithm needs 3 the proposed method is a special nonnegative matrix factorization which could be unidentifiable how to circumvent such situation since identifiability of nmf affects interpretability a lot
### Summary: | this work propose a method for learning a kolmogorov model which is a binary random variable model that is very similar or identical to a matrix factorization model the work proposes an alternative optimization approach that is again similar to matrix factorization approaches unfortunately no discussion or experiments are made to compare the proposed problem and method with standard matrix factorization without such comparison it is unclear if the proposed is substantially new or a reformation of a standard problem the authors are encouraged to improve the draft to clarify the connection matrix factorization and standard factor models | [
30003, 310, 1677, 2278, 273, …
209
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
13330,
342,
271,
4722,
1895,
253,
9759,
310,
2590,
253,
253,
2746,
27350,
50276,
35529,
253,
37317,
556,
690,
7350,
670,
253,
6925,
40156,
273,
253,
2746,
285,
253,
2954,
342,
2905,
789,
50276,
262,
651,
320,
1077,
9371,
604,
253,
4477,
812,
4499,
285,
7277,
253,
4081,
2746,
1097,
36143,
285,
36878,
275,
616,
10704,
4679,
50276,
3113,
3082,
323,
23507,
46214,
4315,
39401,
841,
651,
671,
28698,
3746,
281,
19349,
7914,
50274,
783,
878,
323,
14053,
20552,
268,
4113,
275,
253,
2234,
15265,
839,
4893,
310,
30455,
6296,
275,
1097,
17401,
2718,
285,
3320,
2048,
15302,
253,
7313,
403,
417,
12450,
275,
253,
830,
273,
20552,
323,
4227,
275,
253,
4679,
253,
4477,
39142,
253,
1268,
273,
3320,
12091,
281,
1056,
310,
1007,
347,
247,
5912,
534,
407,
253,
1039,
310,
1077,
1027,
432,
253,
891,
301,
6447,
9978,
2783,
275,
4758,
337,
323,
253,
1855,
928,
561,
10895,
352,
310,
12744,
849,
253,
941,
369,
638,
36981,
281,
4044,
253,
2540,
268,
4113,
50275,
7152,
339,
431,
248,
2929,
19401,
247,
2900,
281,
247,
7605,
5864,
1895,
253,
4081,
2900,
8687,
247,
14717,
597,
1067,
247,
38301,
44519,
42017,
1566,
752,
3686,
310,
417,
17285,
275,
667,
1039,
285,
13477,
479,
247,
2257,
253,
14717,
556,
767,
4243,
337,
247,
13358,
3720,
1159,
326,
3198,
281,
320,
6888,
285,
374,
247,
13358,
3268,
689,
253,
3720,
3603,
253,
4853,
271,
13757,
1895,
374,
534,
556,
247,
941,
1307,
285,
690,
8985,
285,
44053,
10806,
285,
597,
12661,
247,
17040,
285,
14717,
273,
436,
13757,
1895,
597,
564,
327,
281,
1750,
326,
15577,
19349,
2493,
476,
320,
840,
22245,
407,
16030,
272,
253,
14237,
597,
452,
34003,
533,
597,
1918,
1652,
4278,
327,
849,
285,
752,
16274,
436,
13812,
556,
275,
3946,
50276,
2520,
778,
320,
4755,
281,
247,
749,
3423,
6485,
533,
352,
310,
417,
2590,
281,
479,
387,
512,
253,
2929,
310,
12171,
5185,
533,
891,
452,
7596,
4685,
253,
7680,
285,
15606,
275,
253,
3862,
5145,
4715,
1673,
50275,
74,
717,
417,
271,
6485,
275,
46449,
594,
891,
2550,
7472,
253,
7680,
533,
891,
476,
1333,
326,
752,
6284,
479,
403,
2593,
3307,
285,
4706,
5329,
285,
597,
1097,
2430,
247,
2257,
1805,
4028,
3307,
1160,
1841,
1199,
625,
27350,
533,
891,
1891,
281,
923,
849,
253,
15301,
4778,
31825,
2250,
660,
18279,
3966,
476,
6830,
1705,
562,
273,
253,
941,
891,
1158,
436,
310,
271,
1774,
1127,
281,
1329,
253,
4665,
1430,
1750,
347,
323,
577,
891,
1158,
627,
310,
2316,
323,
30328,
3652,
627,
347,
973,
347,
7364,
24088,
752,
3686,
273,
27377,
476,
320,
1160,
285,
417,
3966,
4720,
323,
608,
891,
1089,
326,
1077,
4722,
533,
891,
1089,
352,
2834,
281,
452,
253,
987,
30328,
670,
752,
253,
1329,
1617,
2097,
285,
849,
326,
7729,
275,
247,
8542,
4758,
50276,
856,
84,
50276,
68,
666,
1319,
285,
4665,
1430,
403,
2201,
10746,
273,
2561,
50276,
339,
3030,
751,
247,
3588,
7680,
327,
271,
4722,
1895,
772,
50276,
783,
1029,
5251,
5406,
310,
4942,
2590,
533,
891,
1089,
1774,
1841,
1077,
2834,
281,
15909,
50275,
783,
38301,
44519,
42017,
1566,
5426,
891,
1089,
21643,
533,
891,
717,
417,
271,
6485,
275,
46449,
253,
10199,
943,
1918,
690,
30328,
670,
752,
326,
310,
285,
2139,
352,
310,
247,
1175,
2934,
50276,
8606,
352,
1077,
1892,
281,
452,
247,
18893,
5406,
273,
253,
7364,
285,
2939,
253,
9021,
273,
253,
2929,
7152,
339,
431,
248,
37317,
9010,
326,
253,
4081,
1332,
4722,
253,
1566,
310,
1077,
4076,
285,
253,
27570,
275,
19349,
17032,
310,
1534,
253,
4028,
310,
671,
4076,
285,
2590,
253,
37317,
556,
2067,
7350,
50276,
18,
253,
5933,
3133,
417,
1077,
44755,
275,
253,
767,
749,
856,
23042,
627,
310,
581,
14042,
407,
247,
1781,
1180,
273,
7529,
39868,
2967,
256,
5267,
310,
3240,
8214,
285,
323,
1016,
5084,
275,
253,
941,
4315,
581,
556,
281,
8415,
271,
256,
5267,
275,
1016,
19502,
436,
310,
1512,
1199,
323,
1781,
4311,
3818,
3109,
2718,
275,
958,
275,
253,
3368,
337,
327,
1855,
928,
561,
253,
5933,
369,
760,
5762,
327,
247,
417,
84,
6062,
463,
10895,
285,
1408,
608,
25142,
253,
37317,
9193,
326,
625,
15216,
943,
320,
5762,
24088,
625,
25142,
2710,
9552,
273,
10895,
3966,
18505,
253,
1180,
273,
25142,
671,
7835,
247,
2372,
11755,
1580,
352,
310,
625,
27350,
281,
3523,
253,
5933,
970,
690,
12820,
873,
390,
672,
253,
5933,
26414,
762,
247,
2176,
17705,
50276,
19,
253,
5933,
2987,
342,
5912,
273,
8985,
941,
436,
310,
3240,
1892,
281,
6642,
275,
3946,
323,
1650,
952,
13052,
247,
6440,
323,
760,
2378,
352,
310,
1892,
281,
2028,
752,
310,
253,
5912,
273,
11365,
436,
751,
352,
3133,
326,
253,
3368,
629,
273,
436,
2929,
858,
417,
4518,
1375,
849,
281,
4044,
253,
5912,
326,
253,
5933,
3198,
50276,
20,
253,
4081,
1332,
310,
247,
2714,
46214,
4315,
39401,
534,
812,
320,
440,
888,
18397,
849,
281,
39256,
824,
4112,
1580,
1548,
18279,
1430,
273,
9153,
71,
11852,
4665,
1430,
247,
2257,
50276,
187,
187,
4118,
18435,
27,
2520,
789,
12661,
247,
1332,
323,
4715,
247,
38301,
44519,
42017,
1566,
50276,
4609,
310,
247,
8985,
3632,
4778,
1566,
326,
310,
1077,
2074,
390,
8931,
281,
247,
4315,
39401,
1566,
253,
789,
29328,
271,
5795,
13757,
2746,
326,
310,
969,
2074,
281,
4315,
39401,
7274,
50276,
328,
9520,
642,
5955,
390,
4679,
403,
1160,
281,
7277,
253,
50276,
856,
7334,
1895,
285,
1332,
342,
2629,
4315,
39401,
1293,
824,
5301,
352,
310,
12744,
604,
253,
4081,
310,
9619,
747,
390,
247,
294,
1248,
273,
247,
2629,
1895,
253,
4477,
403,
14659,
281,
3157,
253,
7482,
281,
19148,
253,
4602,
4315,
39401,
285,
2629,
2803,
3210,
209
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
clarity the motivations of learning representations for time dependent proximity graphs generated from contact tracing are well explained the learning of two disentangled representations to capture topological structure and temporal dynamics information is clearly demonstrated novelty the paper extends a previous proof by levy and goldberg about how mikolovs sgns is implicitly factorizing a wordcontext matrix to higher order tensors this higher order generalisation applied to timevarying graphs is used to learn embeddings which are shown to effectively encode graph structure and dynamics impact the paper proposes an extension of a previous well known proof for higher order tensors which leads to an embedding technique which is shown to perform well on real world datasets the code and datasets have been made freely available correctness several experiments have been conducted to demonstrate effectiveness of learned embeddings theoretical groundwork for the proposed model has been well laiddocsepsummary the paper proposes an implicit tensor factorization approach for learning timevarying node representations over dynamic networks the core method lifts the wellknown skip gram based embedding approach from matrix to higher order tensors to support temporal dimensions the authors claim that such tensor based treatment allows to disentangle the role of node and time negative sampling method similar to noise contrastive estimation is extended the higher order tensor setting and incorporated in the cross entropy objective for training in the experiments the authors consider five variants of facetoface proximity data that contains temporal interactions and focuses on tasks of node classification predicting outcome of sir epidemic process and link prediction in the form of event reconstruction the proposed method has been compared against two discrete time graph representation learning model and a recently proposed tensor based method the authors claim that the provided method shows comparable performance with requirement to train lesser number of parameters also the authors provide qualitative analysis in terms of embedding visualizations and goodness of fit plots strengths the paper focuses on an important problem of representation learning for timevarying graphs as several realworld networks consist of interactions between nodes that occur and change over time the higherorder skip gram based method proposed by the authors promises better parameter complexity compared to other methods while retaining or exceeding previous empirical performance both the high level tasks of predicting outcome of epidemic process and event reconstruction are useful and important realizations of basic node classification and link prediction tasks respectively in the timedependent setting the authors show that the method provides significant empirical performance on both tasks compared to nontensor based methods and comparable performance to recently proposed approach in 4 i find the qualitative analysis provided by the authors in the appendix in terms of embedding visualization execution time goodness of fit and other ablations to be very insightful concerns and improvements one of the major concerns i have is the novelty of the overall approach both skip gram embedding and negative sampling based training is wellknown approach while the authors claim that their key contribution is to use these techniques for higher order structures than a matrix however skipgram type of methods with negative sampling for 3order 
tensor have been well studied in relational learning literature 1 for static case this has also been extended to 4order tensor to consider temporal dimension and presented in previous iclr 2 hence the novelty of the technical contribution is not very clear one of the novel part seems to be learning different representations of node and time but i could not see how this is more helpful than previous approaches the author needs to adequately discuss and analyze this and present the outcome in the scenario when they only learn one of the two representations in addition to comparing with the relational learning literature static and dynamic both in terms of empirical performance and methodological difference the authors also miss comparison with a recently proposed tensor based representation learning method for dynamic networks 3 the paper needs to compare with all these methods and distinguish the technical differences to exactly discern the value of the presented approach the third major concern is the authors claim on the requirement to use lesser number of parameters i do not find enough evidence in the paper to support this claim the author need to exemplify the difference in the number of parameters for particular case compared to dyane and other methods see table v in 1 for example also it is important to test how these lesser number of parameters affect the empirical performance finally the execution times in table 1 do not seem to show any speedup gained due to these less number of parameters can the authors discuss more on these effects of lesser number of parameters in addition to retaining performance of dyane and also compare this with other methods i mentioned above as a followup most experiments are done on graphs with few hundred nodes while realworld networks contains thousands and even more number of nodes i am not able to see how this method would scale to such large graphs does the improvement in parameter complexity above help to promote scalability or is this a limitation of the proposed approach from my understanding the authors use the warmup effect for finding a good initialization of the parameters however the gap shown in figure 1 in appendix for warmup vs non warmup case is concerning as it appears that majority of the performance gain especially considering only marginal performance increase over dyane is achieved due to warmup steps and less due to the effect of the training via negative sampling overall the paper reads very dense and needs an improved presentation specifically the concepts explained in section 2 and 3 can be better presented using figures to explain the details such as cross coupling see fig 1 in 4 for an example while the authors present better performance for the proposed approach in the experiment section and also provide qualitative analysis in the appendix it is also important for the authors to discuss and analyze the reasons behind the performance gain with this method in the text in addition to describing the results from the table minor points not affecting the score i find the overall performance increase over dyane to be marginal and while this is not a major concern in itself that has affected my score for the paper the authors need to provide more comparisons and distinguish their approach from dyane with better analysis in terms of parameter complexity to fully support their claim of better performance minor type below equation 32 should it be pnijk it seems i and its corresponding term in rhs is missing given the concerns above i 
currently do not find this paper ready for publication i will be happy to revisit my score based on the authors response to the concerns raised by me above references 1 a review of relational machine learning for knowledge graphs nickel et al 2015 2 tensor decompositions for temporal knowledge base completion lacroix et al iclr 2020 3 dynamic graph convolutional networks using the tensor mproduct malik et al 2020 4 dyane dynamicsaware node embedding for temporal networks sato et al 2020 docsepmain idea in this paper the authors studied the problem of timevarying graph embedding problems the authors generalized skipgram based graph embedding method to timevarying graphs the authors show that the method can be used to factorize timevarying graphs as highorder tensors via negative sampling the authors carried out experiments on several timeresolved proximity networks with comparison to several stateofart baselines strength the paper is well written and technically sound the authors provided theoretical analysis for the approximation of negative sampling to tensor factorization the authors carried out extensive experiments on several realworld networks with comparison to strong baselines the application of sir node classification task is very interesting weakness the proposed method does not utilize the special property of timevarying graphs in the equation 31 the positive instances are just single edges while the negative are just for independent marginals it is not clear how random walk is involved in this setting the proposed method is more like an acceleration to traditional tensor factorization method for timevarying graphs in this case the authors should 1 include tensor factorization baselines to compare the accuracy and 2 carry out experiments on efficiency comparison with classic tensor factorization methods the scalability of the proposed method is another concern the largest network the authors experiment with has 200 nodes it will be good to see scalability experiments for example on synthetic networks for the evaluation of temporal event reconstruction it would be better to use time to separate traintest also it would be better to include some simple static graph embedding baseliness to operate on the network with all timestep graphs combined detailed comments line below equation 21 please provide definition for volg docsepin this paper the authors propose learning node embeddings of time varying graphs they extend the ideas from skip gram negative sampling sgns to time varying graphs they extend the relationship between sgns and matrix factorization to a tensor setting the key contribution seems to be learning a static embedding for each node and an embedding for a time step these embeddings are combined to learn a timeaware node embedding experiments on multiple datasets show that the proposed method outperforms related benchmarks the paper is well written and easy to follow my main concern is with the degree of novelty i think the work is relevant but am not convinced that the novelty is sufficient to warrant acceptance it seems a rather straightforward extension of sgns as well as the idea that the shifter pmi matrix can be factorized to a tensor the datasets the authors use are also quite small so im not convinced that the method is scalable indeed scalable tensor factorization is a challenging problem its unclear what the authors gain by casting this as tensor factorization rather than just solving the sgns problem like in word2vec infact i think thats basically what 38 does 
additional comments p3 whats omegai j t0 sec 31 paragraph 2 obtained by collecting cooccurrences can the jump in t be arbitrary how are the i j k tuples constructed specifically from table 1 these datasets are quite small is there a note on scalability how large can you go from table 3 the results from hosgns are significantly better sometimes 99 compared to the baselines can you provide some intuition as to why this is what explains the large jump for the task in table 3 it seems the stat version of the model works a lot better and adding dyn actually makes it worse does that mean edge detection is hampered by taking time into account an explanation would be good
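to make the sgns to higherorder tensor idea discussed above concrete, the following is a minimal sketch of skipgram style negative sampling over (node, context node, time) triples. it is an illustrative reconstruction only: the trilinear score, the logistic negative sampling loss, the synthetic triples and all sizes are assumptions and need not match the exact formulation in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy sizes (assumed for illustration only)
n_nodes, n_times, dim = 50, 10, 16
n_triples, n_neg, lr, epochs = 2000, 5, 0.05, 5

# synthetic (node i, context node j, time k) co-occurrence triples
triples = rng.integers(0, [n_nodes, n_nodes, n_times], size=(n_triples, 3))

# disentangled factors: one static embedding per node (target and context roles)
# plus one embedding per time step
node_emb = 0.1 * rng.standard_normal((n_nodes, dim))
ctx_emb = 0.1 * rng.standard_normal((n_nodes, dim))
time_emb = 0.1 * rng.standard_normal((n_times, dim))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(epochs):
    for i, j, k in triples:
        # one observed context plus a few negatives sampled uniformly at random
        candidates = [(j, 1.0)] + [(jn, 0.0) for jn in rng.integers(0, n_nodes, n_neg)]
        for j_cur, label in candidates:
            # trilinear score of cell (i, j_cur, k) of the implicit tensor
            score = np.sum(node_emb[i] * ctx_emb[j_cur] * time_emb[k])
            g = sigmoid(score) - label  # gradient of the logistic loss w.r.t. the score
            grad_node = g * ctx_emb[j_cur] * time_emb[k]
            grad_ctx = g * node_emb[i] * time_emb[k]
            grad_time = g * node_emb[i] * ctx_emb[j_cur]
            node_emb[i] -= lr * grad_node
            ctx_emb[j_cur] -= lr * grad_ctx
            time_emb[k] -= lr * grad_time

# a time-aware representation of node i at time k can be read off as
# node_emb[i] * time_emb[k]
print(node_emb.shape, ctx_emb.shape, time_emb.shape)
```

under this loss each positive or negative pair plays the role of one observed or unobserved cell of a cooccurrence tensor, which is the sense in which the training can be read as an implicit higherorder factorization.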
### Summary: | the paper is concerned with learning representations for timevarying graphs which is an important problem that is relevant to the iclr community for this purpose the authors propose a new method to extend skipgram with negative sampling to higherorder tensors with the goal to perform an implicit tensor factorization of timevarying graphs the proposed approach shows promising experimental improvements compared to previous methods reviewers highlighted also the tasks considered in the paper as well as the theoretical and qualitative analysis as further positive aspects however there exist still concerns regarding the current version of the manuscript in particular reviewers raised concerns regarding the novelty of the approach sgns its extension to higherorder tensors as well as the connection to pmi have been studied in the literature as such the new technical contributions are limited reviewers raised also concerns regarding the scalability of the method and its applicability to large graphs the revised version addresses this concern to some extent by showing experiments on midsized graphs with 2000-5000 nodes while this clearly improves the paper i agree with the majority of the reviewers that the manuscript requires an additional revision to iron out the points raised in this round of reviews however the presented results are indeed promising and id encourage the authors to revise and resubmit their work considering the reviewers feedback | [
input_ids: token-ID sequence for this row ] | [ attention_mask: a run of 1s, one per token ] | [ labels: token-ID sequence for this row ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper considers the binary classification problem with exponential loss and relu activation function single neuron the authors characterize the asymptotic loss landscape by three different types of critical points they prove that gradient descent gd will result in four different regions and provide convergence rates for gd to converge to an asymptotic global minimum asymptotic local minimum and local minimum under certain assumptions the authors also provide convergence results for stochastic gradient descent sgd and provide extensions to leaky relu activation and multineuron networks the paper is well written and the results are mostly clearly presented this paper mostly follows the line of research by soudry et al 2017 2018 while it has its own merit due to the relu activation function considered however there are many strong assumptions that are not carefully verified and i really have concerns about the contribution of this paper since they simplify their analysis and results merely by imposing stringent conditions in particular i have the following major comments about the paper 1 in the definition of maxmargin direction why you use argminw maxi wtopxi it seems to me that the definition should be argmaxw mini wtopxi this definition keeps appearing in multiple places in the main paper 2 in the proof of theorem 32 i am confused by the argument of the case that hat w is not in the linearly separable region more clarification is needed to make the proof rigorous 3 in the analysis of theorem 33 and 34 the authors make a very stringent assumption that the iterate wt staying in linear separable region for all tmathcalt this assumption seems too strong which should be verified rather than imposed in analysis of sgd note that even the example shown in proposition 2 is still very restrictive you require all the positive examples or negative examples are very close to one another 4 furthermore in the analysis of sgd the authors did not specify the assumption that hat w lies in the linear separable region which is also required in this theorem and also very strong given such strong assumptions the analytic results seem to be trivial and it is hard to evaluate the authors contribution 5 for the convergence results of sgd the current rate is derived on the distance between ewt hatw2 can you provide similar results for mean square error e wt hatw 2 6 in multineuron case the authors again make very strong assumptions that all the neurons have unchanging activation status this is not easily achievable without careful characterization or other rigorous assumptions under such strong assumptions the extension to multineuron again seems not very meaningful other minor comments 1 the references are not correctly cited for instance please correct the use of parenthesis in which is different from that in soudry et al 2017 corollary 8 and hold for various other types of gradientbased algorithms gunasekar et al 2018 2 the sentence which the nature of convergence is different from does not read well should it be where or of which docseprecently the implicit bias where gradient descent converges the maxmargin classifier was shown for linear models without an explicit regularization this paper tries to extend this result to relu network which is more challenging because of the nonconvexity moreover a similar property of stochastic gradient descent is also discussed the implicit bias is a key property to ensure the superior performance of overparameterized models hence this line of research is also important 
however i think there are several concerns as summarized below 1 im not sure about the significance of the relu model p considered in the paper indeed the problem p is challenging but an obtained model is linear defined by w therefore an advantage of this model over linear models is unclear moreover since the maxmargin in this paper is defined by using part of dataset and it is different from the conventional maxmargin the generalization guarantees are not ensured by the margin theory therefore i cannot figure out the importance of an implicit bias in this setting which ensures the convergence to this modified maxmargin solution in addition the definition of the maxmargin seems to be incorrect argmin max argmax min 2 proposition 1 variance bound gives a bound on the sum of norms of stochastic gradients however i think this bound is obvious because stochastic gradients of the relu model p are uniformly bounded by the relu activation combining this boundedness and decreasing learning rates the bound in proposition 1 can be obtained immediately moreover the validity of an assumption on wt made in the proposition should be discussed 3 lemma f2 is key to show the main theorem but i wonder whether this lemma is correct i think the third equation in the proof seems to be incorrect docsepthis paper studies relu model or equivalently onelayeroneneuron model for the classification problem this paper shows if the data is linearly separable gradient descent may converge to either a global minimum or a suboptimal local minimum or diverges this paper further studies the implicit bias induced by gd and sgd and shows if they converge they can have a maximum margin solution comments 1 using relu model for linearly separable data doesnt make sense to me when relu is used i expect some more complicated separable condition 2 this paper only studies onelayeroneneuron model which is a very restricted setting its hard to see how this result can be generalized to the multipleneuron case 3 the analysis follows closely with previous work in studying the implicit bias for linear models
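to ground the single relu neuron discussion above, the following is a minimal numerical sketch. it assumes the per example loss exp(-y_i relu(w·x_i)) and the margin convention min_i y_i x_i·w/||w|| for the normalized iterate; both conventions are assumptions for illustration and may differ from the exact definitions in the paper. the sketch runs plain gradient descent from a small initialization and prints the final direction and its margin, which may or may not be close to the maxmargin direction depending on the data and the run, which is exactly the kind of distinction at stake above.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy linearly separable data: labels from a fixed ground-truth direction w_star
n = 200
x = rng.standard_normal((n, 2))
w_star = np.array([1.0, 2.0])
y = np.where(x @ w_star >= 0.0, 1.0, -1.0)

def grad(w):
    z = x @ w
    a = np.maximum(z, 0.0)                    # relu(<w, x_i>)
    loss = np.exp(-y * a)                     # per-example exponential loss
    # d/dw exp(-y * relu(z)) = exp(-y * relu(z)) * (-y) * 1[z > 0] * x
    return ((loss * (-y) * (z > 0.0))[:, None] * x).mean(axis=0)

w = 0.01 * rng.standard_normal(2)             # small initialization
lr = 0.1
for _ in range(5000):
    w -= lr * grad(w)

w_dir = w / np.linalg.norm(w)
margin = np.min(y * (x @ w_dir))              # margin of the normalized direction
print("direction:", w_dir, "margin:", margin)
```

the initialization scale, the learning rate and the activation (leaky relu instead of relu) are the natural knobs to vary when probing the different convergence regions discussed above.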
### Summary: | the reviewers and ac note the following potential weaknesses 1 the proof techniques largley follow from previous work on linear models 2 its not clear how signficant it is to analyze a oneneuron relu model for linearly separable data | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
19401,
253,
8985,
9162,
1895,
342,
17619,
2957,
285,
774,
86,
5743,
1159,
2014,
23586,
253,
4477,
17710,
253,
20185,
2957,
13016,
407,
1264,
1027,
3510,
273,
4619,
2792,
597,
5276,
326,
11786,
18499,
305,
69,
588,
906,
275,
1740,
1027,
4811,
285,
2085,
14940,
4142,
323,
305,
69,
281,
29623,
281,
271,
20185,
4156,
5927,
20185,
1980,
5927,
285,
1980,
5927,
762,
2176,
13260,
253,
4477,
671,
2085,
14940,
1543,
323,
19191,
11786,
18499,
256,
35333,
285,
2085,
18149,
281,
13584,
90,
774,
86,
5743,
285,
1554,
460,
27658,
6928,
253,
2929,
310,
973,
3542,
285,
253,
1543,
403,
6571,
4518,
3559,
436,
2929,
6571,
3637,
253,
1386,
273,
2561,
407,
256,
2995,
610,
1162,
355,
4240,
4765,
1223,
352,
556,
697,
1211,
15785,
1955,
281,
253,
774,
86,
5743,
1159,
2783,
2299,
627,
403,
1142,
2266,
13260,
326,
403,
417,
9257,
16058,
285,
891,
1663,
452,
7350,
670,
253,
7680,
273,
436,
2929,
1580,
597,
25636,
616,
1783,
285,
1543,
7960,
407,
23254,
32881,
2515,
275,
1798,
891,
452,
253,
1563,
2201,
5701,
670,
253,
2929,
50276,
18,
186,
249,
253,
5426,
273,
2781,
15456,
3884,
2139,
368,
897,
1736,
1222,
88,
2781,
74,
259,
3956,
2981,
352,
3133,
281,
479,
326,
253,
5426,
943,
320,
1736,
4090,
88,
12949,
259,
3956,
2981,
436,
5426,
11359,
15602,
275,
2709,
5053,
275,
253,
2022,
2929,
50276,
19,
186,
249,
253,
4737,
273,
10012,
4567,
891,
717,
13477,
407,
253,
4154,
273,
253,
1083,
326,
7856,
259,
310,
417,
275,
253,
23352,
39690,
2919,
625,
37699,
310,
3058,
281,
1056,
253,
4737,
26565,
495,
186,
249,
253,
1783,
273,
10012,
5922,
285,
5910,
253,
4477,
1056,
247,
1077,
32881,
9376,
326,
253,
35388,
22923,
14596,
275,
4872,
39690,
2919,
323,
512,
246,
1588,
85,
436,
9376,
3133,
1512,
2266,
534,
943,
320,
16058,
2581,
685,
11295,
275,
1783,
273,
256,
35333,
3877,
326,
1014,
253,
1650,
2011,
275,
13989,
374,
310,
1335,
1077,
29190,
368,
2430,
512,
253,
2762,
6667,
390,
4016,
6667,
403,
1077,
2810,
281,
581,
1529,
577,
186,
44295,
3062,
275,
253,
1783,
273,
256,
35333,
253,
4477,
858,
417,
13199,
253,
9376,
326,
7856,
259,
8696,
275,
253,
4872,
39690,
2919,
534,
310,
671,
2424,
275,
436,
10012,
285,
671,
1077,
2266,
1677,
824,
2266,
13260,
253,
20059,
1543,
1646,
281,
320,
14916,
285,
352,
310,
1892,
281,
7472,
253,
4477,
7680,
50275,
22,
186,
1542,
253,
14940,
1543,
273,
256,
35333,
253,
1655,
2281,
310,
6012,
327,
253,
4181,
875,
299,
17118,
50276,
700,
88,
19,
476,
368,
2085,
2074,
1543,
323,
1599,
6278,
2228,
299,
22923,
50276,
700,
88,
374,
50276,
23,
186,
249,
1554,
460,
27658,
1083,
253,
4477,
969,
1056,
1077,
2266,
13260,
326,
512,
253,
8512,
452,
440,
28276,
5743,
3708,
436,
310,
417,
4354,
39941,
1293,
10182,
14846,
390,
643,
26565,
13260,
762,
824,
2266,
13260,
253,
6880,
281,
1554,
460,
27658,
969,
3133,
417,
1077,
14282,
50276,
977,
5884,
5701,
337,
186,
783,
10414,
403,
417,
9113,
11106,
323,
4227,
4496,
3451,
253,
897,
273,
2885,
25232,
275,
50276,
4609,
310,
1027,
432,
326,
275,
256,
2995,
610,
1162,
355,
4240,
40460,
854,
285,
50276,
4949,
323,
2710,
643,
3510,
273,
11786,
3169,
11333,
5654,
511,
18970,
1162,
355,
4765,
374,
186,
783,
6197,
50276,
4609,
253,
3753,
273,
14940,
310,
1027,
432,
50276,
18566,
417,
1239,
973,
943,
352,
320,
835,
390,
273,
534,
5474,
339,
3456,
1154,
314,
253,
15424,
8492,
835,
11786,
18499,
26414,
253,
2781,
15456,
30410,
369,
2011,
323,
4872,
3210,
1293,
271,
6843,
37820,
436,
2929,
14177,
281,
9017,
436,
906,
281,
774,
86,
2990,
534,
310,
625,
11132,
984,
273,
253,
1327,
44181,
414,
25761,
247,
2074,
2867,
273,
19191,
11786,
18499,
310,
671,
5469,
50276,
783,
15424,
8492,
310,
247,
2234,
2867,
281,
5416,
253,
8936,
3045,
273,
689,
19484,
1025,
3210,
7613,
436,
1386,
273,
2561,
310,
671,
1774,
2299,
891,
1158,
627,
403,
2067,
7350,
347,
17903,
2708,
50276,
18,
516,
417,
2119,
670,
253,
8453,
273,
253,
774,
86,
1566,
268,
2783,
275,
253,
2929,
6296,
253,
1895,
268,
310,
11132,
533,
271,
2797,
1566,
310,
4872,
2931,
407,
259,
3103,
271,
5750,
273,
436,
1566,
689,
4872,
3210,
310,
12744,
50276,
3062,
1189,
1580,
253,
2781,
15456,
275,
436,
2929,
310,
2931,
407,
970,
629,
273,
10895,
285,
352,
310,
1027,
432,
253,
6041,
2781,
15456,
253,
26647,
23632,
403,
417,
33075,
407,
253,
8459,
3762,
3103,
891,
2550,
4677,
562,
253,
6349,
273,
271,
15424,
8492,
275,
436,
4758,
50276,
4609,
20096,
253,
14940,
281,
436,
7321,
2781,
15456,
2900,
275,
1635,
253,
5426,
273,
253,
2781,
15456,
3133,
281,
320,
13583,
1736,
1222,
2781,
50276,
1662,
4090,
1054,
50276,
19,
13989,
337,
11041,
3033,
4245,
247,
3033,
327,
253,
2020,
273,
22429,
273,
19191,
27935,
2299,
891,
1158,
436,
3033,
310,
4755,
984,
19191,
27935,
273,
253,
774,
86,
1566,
268,
403,
17568,
11542,
407,
253,
774,
86,
5743,
16248,
436,
11542,
1255,
285,
11052,
4715,
4142,
253,
3033,
275,
13989,
337,
476,
320,
2797,
4745,
25761,
253,
13091,
273,
271,
9376,
327,
22923,
1160,
275,
253,
13989,
943,
320,
5469,
50276,
20,
18057,
269,
19,
310,
2234,
281,
921,
253,
2022,
10012,
533,
891,
4282,
1880,
436,
18057,
310,
3451,
891,
1158,
253,
2626,
5150,
275,
253,
4737,
3133,
281,
320,
13583,
5474,
33032,
2520,
2929,
2175,
774,
86,
1566,
390,
39406,
327,
293,
4071,
251,
1751,
27658,
1566,
323,
253,
9162,
1895,
436,
2929,
2722,
604,
253,
941,
310,
23352,
39690,
11786,
18499,
778,
29623,
281,
2057,
247,
4156,
5927,
390,
247,
749,
29776,
1980,
5927,
390,
11711,
2510,
436,
2929,
2007,
2175,
253,
15424,
8492,
5802,
407,
305,
69,
285,
256,
35333,
285,
2722,
604,
597,
29623,
597,
476,
452,
247,
4869,
8459,
2900,
50275,
26122,
337,
970,
774,
86,
1566,
323,
23352,
39690,
941,
36908,
1056,
3282,
281,
479,
672,
774,
86,
310,
908,
891,
1902,
690,
625,
9542,
39690,
1617,
50276,
19,
436,
2929,
760,
2175,
327,
293,
4071,
251,
1751,
27658,
1566,
534,
310,
247,
1077,
11096,
4758,
697,
1892,
281,
923,
849,
436,
906,
476,
320,
14923,
281,
253,
18878,
1751,
27658,
1083,
495,
253,
1783,
3637,
8244,
342,
2045,
789,
275,
12392,
253,
15424,
8492,
323,
4872,
3210,
2490,
187,
4118,
18435,
27,
783,
30628,
285,
913,
3877,
253,
1563,
2442,
32213,
337,
253,
4737,
5609,
7950,
2205,
956,
432,
2045,
789,
327,
4872,
3210,
374,
697,
417,
2590,
849,
861,
71,
280,
386,
352,
310,
281,
12106,
247,
327,
1751,
27658,
774,
86,
1566,
323,
23352,
39690,
941,
209
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
19401,
253,
8985,
9162,
1895,
342,
17619,
2957,
285,
774,
86,
5743,
1159,
2014,
23586,
253,
4477,
17710,
253,
20185,
2957,
13016,
407,
1264,
1027,
3510,
273,
4619,
2792,
597,
5276,
326,
11786,
18499,
305,
69,
588,
906,
275,
1740,
1027,
4811,
285,
2085,
14940,
4142,
323,
305,
69,
281,
29623,
281,
271,
20185,
4156,
5927,
20185,
1980,
5927,
285,
1980,
5927,
762,
2176,
13260,
253,
4477,
671,
2085,
14940,
1543,
323,
19191,
11786,
18499,
256,
35333,
285,
2085,
18149,
281,
13584,
90,
774,
86,
5743,
285,
1554,
460,
27658,
6928,
253,
2929,
310,
973,
3542,
285,
253,
1543,
403,
6571,
4518,
3559,
436,
2929,
6571,
3637,
253,
1386,
273,
2561,
407,
256,
2995,
610,
1162,
355,
4240,
4765,
1223,
352,
556,
697,
1211,
15785,
1955,
281,
253,
774,
86,
5743,
1159,
2783,
2299,
627,
403,
1142,
2266,
13260,
326,
403,
417,
9257,
16058,
285,
891,
1663,
452,
7350,
670,
253,
7680,
273,
436,
2929,
1580,
597,
25636,
616,
1783,
285,
1543,
7960,
407,
23254,
32881,
2515,
275,
1798,
891,
452,
253,
1563,
2201,
5701,
670,
253,
2929,
50276,
18,
186,
249,
253,
5426,
273,
2781,
15456,
3884,
2139,
368,
897,
1736,
1222,
88,
2781,
74,
259,
3956,
2981,
352,
3133,
281,
479,
326,
253,
5426,
943,
320,
1736,
4090,
88,
12949,
259,
3956,
2981,
436,
5426,
11359,
15602,
275,
2709,
5053,
275,
253,
2022,
2929,
50276,
19,
186,
249,
253,
4737,
273,
10012,
4567,
891,
717,
13477,
407,
253,
4154,
273,
253,
1083,
326,
7856,
259,
310,
417,
275,
253,
23352,
39690,
2919,
625,
37699,
310,
3058,
281,
1056,
253,
4737,
26565,
495,
186,
249,
253,
1783,
273,
10012,
5922,
285,
5910,
253,
4477,
1056,
247,
1077,
32881,
9376,
326,
253,
35388,
22923,
14596,
275,
4872,
39690,
2919,
323,
512,
246,
1588,
85,
436,
9376,
3133,
1512,
2266,
534,
943,
320,
16058,
2581,
685,
11295,
275,
1783,
273,
256,
35333,
3877,
326,
1014,
253,
1650,
2011,
275,
13989,
374,
310,
1335,
1077,
29190,
368,
2430,
512,
253,
2762,
6667,
390,
4016,
6667,
403,
1077,
2810,
281,
581,
1529,
577,
186,
44295,
3062,
275,
253,
1783,
273,
256,
35333,
253,
4477,
858,
417,
13199,
253,
9376,
326,
7856,
259,
8696,
275,
253,
4872,
39690,
2919,
534,
310,
671,
2424,
275,
436,
10012,
285,
671,
1077,
2266,
1677,
824,
2266,
13260,
253,
20059,
1543,
1646,
281,
320,
14916,
285,
352,
310,
1892,
281,
7472,
253,
4477,
7680,
50275,
22,
186,
1542,
253,
14940,
1543,
273,
256,
35333,
253,
1655,
2281,
310,
6012,
327,
253,
4181,
875,
299,
17118,
50276,
700,
88,
19,
476,
368,
2085,
2074,
1543,
323,
1599,
6278,
2228,
299,
22923,
50276,
700,
88,
374,
50276,
23,
186,
249,
1554,
460,
27658,
1083,
253,
4477,
969,
1056,
1077,
2266,
13260,
326,
512,
253,
8512,
452,
440,
28276,
5743,
3708,
436,
310,
417,
4354,
39941,
1293,
10182,
14846,
390,
643,
26565,
13260,
762,
824,
2266,
13260,
253,
6880,
281,
1554,
460,
27658,
969,
3133,
417,
1077,
14282,
50276,
977,
5884,
5701,
337,
186,
783,
10414,
403,
417,
9113,
11106,
323,
4227,
4496,
3451,
253,
897,
273,
2885,
25232,
275,
50276,
4609,
310,
1027,
432,
326,
275,
256,
2995,
610,
1162,
355,
4240,
40460,
854,
285,
50276,
4949,
323,
2710,
643,
3510,
273,
11786,
3169,
11333,
5654,
511,
18970,
1162,
355,
4765,
374,
186,
783,
6197,
50276,
4609,
253,
3753,
273,
14940,
310,
1027,
432,
50276,
18566,
417,
1239,
973,
943,
352,
320,
835,
390,
273,
534,
5474,
339,
3456,
1154,
314,
253,
15424,
8492,
835,
11786,
18499,
26414,
253,
2781,
15456,
30410,
369,
2011,
323,
4872,
3210,
1293,
271,
6843,
37820,
436,
2929,
14177,
281,
9017,
436,
906,
281,
774,
86,
2990,
534,
310,
625,
11132,
984,
273,
253,
1327,
44181,
414,
25761,
247,
2074,
2867,
273,
19191,
11786,
18499,
310,
671,
5469,
50276,
783,
15424,
8492,
310,
247,
2234,
2867,
281,
5416,
253,
8936,
3045,
273,
689,
19484,
1025,
3210,
7613,
436,
1386,
273,
2561,
310,
671,
1774,
2299,
891,
1158,
627,
403,
2067,
7350,
347,
17903,
2708,
50276,
18,
516,
417,
2119,
670,
253,
8453,
273,
253,
774,
86,
1566,
268,
2783,
275,
253,
2929,
6296,
253,
1895,
268,
310,
11132,
533,
271,
2797,
1566,
310,
4872,
2931,
407,
259,
3103,
271,
5750,
273,
436,
1566,
689,
4872,
3210,
310,
12744,
50276,
3062,
1189,
1580,
253,
2781,
15456,
275,
436,
2929,
310,
2931,
407,
970,
629,
273,
10895,
285,
352,
310,
1027,
432,
253,
6041,
2781,
15456,
253,
26647,
23632,
403,
417,
33075,
407,
253,
8459,
3762,
3103,
891,
2550,
4677,
562,
253,
6349,
273,
271,
15424,
8492,
275,
436,
4758,
50276,
4609,
20096,
253,
14940,
281,
436,
7321,
2781,
15456,
2900,
275,
1635,
253,
5426,
273,
253,
2781,
15456,
3133,
281,
320,
13583,
1736,
1222,
2781,
50276,
1662,
4090,
1054,
50276,
19,
13989,
337,
11041,
3033,
4245,
247,
3033,
327,
253,
2020,
273,
22429,
273,
19191,
27935,
2299,
891,
1158,
436,
3033,
310,
4755,
984,
19191,
27935,
273,
253,
774,
86,
1566,
268,
403,
17568,
11542,
407,
253,
774,
86,
5743,
16248,
436,
11542,
1255,
285,
11052,
4715,
4142,
253,
3033,
275,
13989,
337,
476,
320,
2797,
4745,
25761,
253,
13091,
273,
271,
9376,
327,
22923,
1160,
275,
253,
13989,
943,
320,
5469,
50276,
20,
18057,
269,
19,
310,
2234,
281,
921,
253,
2022,
10012,
533,
891,
4282,
1880,
436,
18057,
310,
3451,
891,
1158,
253,
2626,
5150,
275,
253,
4737,
3133,
281,
320,
13583,
5474,
33032,
2520,
2929,
2175,
774,
86,
1566,
390,
39406,
327,
293,
4071,
251,
1751,
27658,
1566,
323,
253,
9162,
1895,
436,
2929,
2722,
604,
253,
941,
310,
23352,
39690,
11786,
18499,
778,
29623,
281,
2057,
247,
4156,
5927,
390,
247,
749,
29776,
1980,
5927,
390,
11711,
2510,
436,
2929,
2007,
2175,
253,
15424,
8492,
5802,
407,
305,
69,
285,
256,
35333,
285,
2722,
604,
597,
29623,
597,
476,
452,
247,
4869,
8459,
2900,
50275,
26122,
337,
970,
774,
86,
1566,
323,
23352,
39690,
941,
36908,
1056,
3282,
281,
479,
672,
774,
86,
310,
908,
891,
1902,
690,
625,
9542,
39690,
1617,
50276,
19,
436,
2929,
760,
2175,
327,
293,
4071,
251,
1751,
27658,
1566,
534,
310,
247,
1077,
11096,
4758,
697,
1892,
281,
923,
849,
436,
906,
476,
320,
14923,
281,
253,
18878,
1751,
27658,
1083,
495,
253,
1783,
3637,
8244,
342,
2045,
789,
275,
12392,
253,
15424,
8492,
323,
4872,
3210,
2490,
187,
4118,
18435,
27,
783,
30628,
285,
913,
3877,
253,
1563,
2442,
32213,
337,
253,
4737,
5609,
7950,
2205,
956,
432,
2045,
789,
327,
4872,
3210,
374,
697,
417,
2590,
849,
861,
71,
280,
386,
352,
310,
281,
12106,
247,
327,
1751,
27658,
774,
86,
1566,
323,
23352,
39690,
941,
209
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the paper identifies two atomic problems respectively in fields of ml mnist classification and quantum mechanics measuring a single photon and brings them together in a simplified setup that uses a single photon emitted according to the spatial distribution of images to classify mnistfashionmnist the introduction of quantum mechanics into the problem is through a trainable computational model of a beam splitterphase shifter mechanism aka a rotation in a high dimensional complex space thats allowed to alter the photons state before hitting the measurement device the paper shows that using this overly simplified and claimed to be physically feasible quantum computer which acts as the representation learning layer improves classification accuracy over any other representation learning method that doesnt use quantum computing the major takeaway is an accessible demonstration of how an elementary quantum computer might work for ml and what may be possible with actual qubits strengths the paper sets out to use two textbook problems in ml and quantum mechanics to introduce a textbook problem at the intersection and does a fairly good job at analyzing the problem extensively given that the overall problem is of broad interest to the representation learning community the solid execution of the paper is itself a good argument for acceptance accessible explanation of quantum states the measurement process and the building blocks of quantum computing comment the paper analyzes a single problem where no classical representation learning method can improve accuracy due to the fact that there is a single photon it would be very beneficial to allude to other setups perhaps in a related work section to provide broader context and help the reader understand better the significance of the problemdocsepthe paper studies that a ml system using quantum interference gives better classification accuracy than a vanilla ml system under the constraint that a classification decision has to be made after detection of the very first photon that passed through an imagefilter the reader can gather that this work brings together the ml and qc worlds but it is not clear what the real motivation of this work is and primarily why is singlephoton important does single photon equate to a single pixel or is this denoting the very first photon that passed the filter also is this constraint of detecting the very first photon valid it might be good to know which audience reads the paper if it is the ml audience then a section on qc basicsterminology will help or at least a graphical abstract to drive home the point home will be helpful was there a reason to use the fashionmnist in conjunction with mnist dataset the authors can also consider to abbreviate fashionmnist to fmnist throughout the paper docsepthis paper focuses on the quantum computing based machine learning and proposes a toy model to illustrate the quantum information processing on the common used handwritten digit dataset mnist more than 40 images can be classified accurately the proposed method looks interesting and the focused problem combining quantum computing and machine learning is of certain significance strength the topic is interesting which inspires the following researchers to focus on the combination of quantum computing and machine learning both the theoretical analysis and experimental results demonstrate that the proposed classifier works well the visualization results in figure 4 are interesting with the proposed photon classifier the semantic 
information of the input images can be well extracted the photon classifier tends to produce large amplitudes for the right classes the proposed method is analyzed in detail from multiple perspectives including results on mnist confusion matrices and visualization weakness in the results section only experiment results of the proposed method is shown there are some previous works related to this paper such as r1 more comparisons and discusses between the proposed method and previous methods are desired the experiments are conducted on two simple datasets mnist and fashionmnist real image data are more complex such as colored natural images could the proposed method be applied on more complex data if can how to extend and apply it please provide such a discussion r1 erfan khoram ang chen dianjing liu etal nanophotonic media for artificial neural inference photon res 78823827 aug 2019docsepthe goals of this work are ambitious to clearly define a setting useful to both physicists and machine learning practitioners the current paper is an excellent step in this direction however with several semesters of undergrad quantum mechanics courses i found it difficult to follow the calculations needed to compute predictions moreover given my graduatelevel experience in machine learning it was even more difficult to clearly understand the data and training algorithm in the current state i do not think iclr is an appropriate venue for this work as it may confuse machine learning practitioners though i think the original goal of the paper is very worthy and look forward to the final version of this work to try to give constructive critique it took me a long time to understand the setup of the problem an illustration of fashionmnist or mnist would help with an incoming photon an arrow through the cutout of the image clearly delineating that the image is binarized and that white pixels are cut out allowing the photon to pass through and a detector an lcd screen is described is this the detector as well or does the photon bounce back and is then detected such an illustration would go a long way for a machine learning practitioner unfamiliar with the double slit experiment and whatnot an algorithm box a machine learning audience is used to thinking in terms of training data and algorithms in this case an algorithm box would help specifically as it shows where the additional information is coming from it seems like the training is in optimizing the parameters of the unitary transform clearly delineating input output and optimization steps will help clarify the method an equation for computing the predictions of the trained model given a photon that passed through a mask a clear formula for computing the class probability with the trained unitary operator parameters clearly describing the baseline it was hard to find the details of the classical performance reported maximal classical performance is confusing wording and implies that maximal is proven theoretically is there a citation for this i may have missed it if it is not proven theoretically then the wording should be changed and a clear description of the architecture training data and training algorithm should be used and code should be included in the supplement this will help machine learning folks understand exactly what the comparison is against from a machine learning standpoint is the single pixel input to the model randomly sampled every time developing an additional baseline finding a unitary transform to find bases corresponding to style and 
class is unfair to the classical method which does not have access to this information not having a baseline that uses this additional information will further confuse machine learning folks as it seems obvious that a model that uses additional information will outperform a classical model that does not use this information for example a tensor decompositionsvd that uses information about style and class might be possible hope this is helpful i think with this additional work it could be quite a valuable contribution as i think the iclr community could be inspired to develop more methods that require complexvalued numbers such as this one edit the authors meaningfully addressed the above points i have raised my score accordingly
### Summary: | this is an unusual but interesting submission can we use a simple quantum computer in fact physical system to solve classification problems in ml a single photon passes through the screen its state is described by the complex vector a quantum computer makes a unitary linear transformation on this state in such a way that it maximizes the overlap with a corresponding class such a model can be parametrized by conventional means and trained and later possibly realized by an quantum system pros 1 the area of qc is very important and such papers shed a new light on the subject 2 inspiration to the iclr community to work on in this area 3 technically correct cons 1 the accuracies are far from sota and use very toy datasets it is not clear how to get to the accuracies needed in practice 2 the actual computational speed of inference is not clear 3 discussion of more complicated models and their possibility is necessary 4 quite a few misprints are in the text which need to be fixed in the final version | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
22649,
767,
13805,
3237,
2975,
275,
4910,
273,
13361,
278,
79,
382,
9162,
285,
6318,
17823,
10499,
247,
2014,
15523,
285,
10316,
731,
2366,
275,
247,
21010,
9978,
326,
4648,
247,
2014,
15523,
23128,
2556,
281,
253,
8820,
3268,
273,
3888,
281,
30215,
278,
79,
382,
29585,
16192,
382,
253,
10199,
273,
6318,
17823,
715,
253,
1895,
310,
949,
247,
6194,
494,
15180,
1566,
273,
247,
8325,
6821,
4069,
14213,
439,
44805,
5122,
38857,
247,
9381,
275,
247,
1029,
15759,
2570,
2317,
28763,
4136,
281,
6990,
253,
20370,
1375,
1078,
16116,
253,
6814,
2813,
253,
2929,
2722,
326,
970,
436,
27662,
21010,
285,
7558,
281,
320,
13318,
17887,
6318,
4382,
534,
6993,
347,
253,
6779,
4715,
3828,
19132,
9162,
7200,
689,
667,
643,
6779,
4715,
1332,
326,
36908,
897,
6318,
12672,
253,
2201,
1379,
12594,
310,
271,
12482,
20028,
273,
849,
271,
18307,
6318,
4382,
1537,
789,
323,
13361,
285,
752,
778,
320,
1896,
342,
4588,
42414,
50276,
296,
3755,
20556,
50276,
783,
2929,
5239,
562,
281,
897,
767,
40554,
3237,
275,
13361,
285,
6318,
17823,
281,
9569,
247,
40554,
1895,
387,
253,
15171,
285,
1057,
247,
9648,
1175,
2628,
387,
18918,
253,
1895,
18171,
1677,
326,
253,
4583,
1895,
310,
273,
3862,
1600,
281,
253,
6779,
4715,
3114,
253,
4891,
10636,
273,
253,
2929,
310,
3139,
247,
1175,
4154,
323,
14924,
50276,
36462,
8813,
273,
6318,
3054,
253,
6814,
1232,
285,
253,
3652,
8336,
273,
6318,
12672,
50276,
13982,
50276,
783,
2929,
3537,
13505,
247,
2014,
1895,
835,
642,
8946,
6779,
4715,
1332,
476,
3157,
7200,
1955,
281,
253,
958,
326,
627,
310,
247,
2014,
15523,
352,
651,
320,
1077,
12912,
281,
512,
2496,
281,
643,
873,
8777,
4931,
275,
247,
2905,
789,
2593,
281,
2085,
16055,
3634,
285,
1361,
253,
9414,
2096,
1805,
253,
8453,
273,
253,
1895,
7152,
339,
431,
248,
2929,
2175,
326,
50276,
66,
13361,
985,
970,
6318,
11689,
4245,
1805,
9162,
7200,
685,
247,
26724,
13361,
985,
762,
253,
50276,
30995,
326,
247,
9162,
3061,
556,
281,
320,
1160,
846,
5481,
273,
253,
1077,
806,
15523,
326,
4817,
949,
271,
2460,
10978,
50276,
783,
9414,
476,
9580,
326,
436,
789,
10316,
2366,
253,
13361,
285,
2805,
68,
20490,
533,
352,
310,
417,
2590,
752,
253,
1524,
16038,
273,
436,
789,
310,
285,
8558,
2139,
310,
2014,
34484,
1774,
1057,
2014,
15523,
1298,
366,
281,
247,
2014,
12275,
390,
310,
436,
1850,
5341,
253,
1077,
806,
15523,
326,
4817,
253,
5806,
671,
310,
436,
7658,
273,
15549,
253,
1077,
806,
15523,
3588,
50275,
262,
1537,
320,
1175,
281,
871,
534,
8446,
9563,
253,
2929,
604,
352,
310,
253,
13361,
8446,
840,
247,
2593,
327,
2805,
68,
5044,
296,
693,
249,
1497,
588,
1361,
390,
387,
1878,
247,
29886,
12002,
281,
4446,
1728,
253,
1127,
1728,
588,
320,
9371,
50275,
4238,
627,
247,
1921,
281,
897,
253,
8142,
16192,
382,
275,
17385,
342,
278,
79,
382,
10895,
253,
4477,
476,
671,
1908,
281,
31931,
4513,
8142,
16192,
382,
281,
269,
16192,
382,
4768,
253,
2929,
50276,
7152,
33032,
2520,
2929,
16633,
327,
253,
6318,
12672,
1754,
5145,
4715,
285,
29328,
247,
20953,
1566,
281,
17093,
253,
6318,
1491,
5162,
327,
253,
1846,
908,
1133,
15720,
6670,
10895,
278,
79,
382,
625,
685,
3387,
3888,
476,
320,
10509,
13613,
253,
4081,
1332,
4453,
4722,
285,
253,
7106,
1895,
16248,
6318,
12672,
285,
5145,
4715,
310,
273,
2176,
8453,
50276,
45563,
50276,
783,
9400,
310,
4722,
534,
6381,
2731,
253,
1563,
8607,
281,
2770,
327,
253,
5019,
273,
6318,
12672,
285,
5145,
4715,
1097,
253,
10527,
1783,
285,
5661,
1543,
7568,
326,
253,
4081,
30410,
2987,
973,
50276,
783,
24426,
1543,
275,
4677,
577,
403,
4722,
342,
253,
4081,
15523,
30410,
253,
24705,
1491,
273,
253,
3280,
3888,
476,
320,
973,
10375,
253,
15523,
30410,
14280,
281,
4711,
1781,
22652,
323,
253,
987,
5971,
50276,
783,
4081,
1332,
310,
5867,
275,
2508,
432,
2709,
24302,
1690,
1543,
327,
278,
79,
382,
13775,
12624,
285,
24426,
50276,
20881,
1255,
50276,
249,
253,
1543,
2593,
760,
3368,
1543,
273,
253,
4081,
1332,
310,
2011,
627,
403,
690,
2045,
2987,
2905,
281,
436,
2929,
824,
347,
391,
18,
625,
14023,
285,
25339,
875,
253,
4081,
1332,
285,
2045,
3082,
403,
6799,
50275,
783,
4679,
403,
5196,
327,
767,
2969,
15302,
278,
79,
382,
285,
8142,
16192,
382,
1524,
2460,
941,
403,
625,
2570,
824,
347,
18010,
3626,
3888,
812,
253,
4081,
1332,
320,
3732,
327,
625,
2570,
941,
604,
476,
849,
281,
9017,
285,
4647,
352,
4496,
2085,
824,
247,
5955,
391,
18,
2827,
20227,
465,
1688,
312,
2897,
260,
864,
277,
757,
75,
272,
632,
86,
1162,
267,
6399,
2689,
302,
5120,
3420,
323,
13345,
11454,
17032,
15523,
501,
818,
2055,
21378,
1630,
14688,
6247,
7152,
339,
431,
248,
7342,
273,
436,
789,
403,
24683,
281,
4518,
4853,
247,
4758,
4217,
281,
1097,
26289,
1346,
285,
5145,
4715,
24432,
50276,
783,
1655,
2929,
310,
271,
7126,
3213,
275,
436,
3884,
50276,
35529,
342,
2067,
3300,
18599,
273,
762,
4971,
6318,
17823,
13519,
891,
1119,
352,
2834,
281,
956,
253,
10426,
3058,
281,
11897,
13650,
50275,
3062,
1189,
1677,
619,
16125,
5251,
2793,
275,
5145,
4715,
352,
369,
1014,
625,
2834,
281,
4518,
2096,
253,
941,
285,
3733,
5933,
50275,
249,
253,
1655,
1375,
891,
513,
417,
1158,
17857,
32888,
310,
271,
4569,
18767,
323,
436,
789,
347,
352,
778,
40678,
5145,
4715,
24432,
2167,
891,
1158,
253,
3236,
4736,
273,
253,
2929,
310,
1077,
18338,
285,
1007,
3579,
281,
253,
2457,
2715,
273,
436,
789,
50276,
936,
1611,
281,
1918,
25799,
29254,
50275,
262,
2335,
479,
247,
1048,
673,
281,
2096,
253,
9978,
273,
253,
1895,
271,
23356,
273,
8142,
16192,
382,
390,
278,
79,
382,
651,
1361,
342,
271,
19363,
15523,
271,
14150,
949,
253,
2624,
483,
273,
253,
2460,
4518,
30191,
839,
326,
253,
2460,
310,
10269,
274,
1025,
285,
326,
3168,
15115,
403,
2624,
562,
6941,
253,
15523,
281,
1509,
949,
285,
247,
13562,
271,
298,
2428,
3601,
310,
2529,
50276,
261,
436,
253,
13562,
347,
973,
390,
1057,
253,
15523,
31888,
896,
285,
310,
840,
5189,
824,
271,
23356,
651,
564,
247,
1048,
1039,
323,
247,
5145,
4715,
34815,
32139,
342,
253,
4021,
31688,
3368,
285,
752,
1439,
50275,
266,
5933,
3817,
247,
5145,
4715,
8446,
310,
908,
281,
4680,
275,
2426,
273,
3733,
941,
285,
11333,
275,
436,
1083,
271,
5933,
3817,
651,
1361,
5742,
347,
352,
2722,
835,
253,
3081,
1491,
310,
3551,
432,
352,
3133,
751,
253,
3733,
310,
275,
39793,
253,
3602,
273,
253,
24287,
4979,
4518,
30191,
839,
3280,
3453,
285,
13757,
5018,
588,
1361,
19148,
253,
1332,
50274,
266,
5150,
323,
12672,
253,
13650,
273,
253,
10166,
1566,
1677,
247,
15523,
326,
4817,
949,
247,
8989,
247,
2590,
7212,
323,
12672,
253,
966,
5912,
342,
253,
10166,
24287,
5572,
3602,
50275,
49346,
12930,
253,
8245,
352,
369,
1892,
281,
1089,
253,
4278,
273,
253,
8946,
3045,
2361,
13493,
8946,
3045,
310,
21643,
41066,
285,
8018,
326,
13493,
310,
11464,
28055,
310,
627,
247,
25577,
323,
436,
891,
778,
452,
9829,
352,
604,
352,
310,
417,
11464,
28055,
840,
253,
41066,
943,
320,
4391,
285,
247,
2590,
5740,
273,
253,
10336,
3733,
941,
285,
3733,
5933,
943,
320,
908,
285,
2127,
943,
320,
2908,
275,
253,
8499,
436,
588,
1361,
5145,
4715,
12633,
2096,
4555,
752,
253,
5301,
310,
1411,
432,
247,
5145,
4715,
32764,
310,
253,
2014,
12275,
3280,
281,
253,
1566,
12421,
19958,
1046,
673,
50274,
16714,
272,
271,
3081,
8245,
4560,
247,
24287,
4979,
281,
1089,
14395,
3969,
281,
3740,
285,
966,
310,
16593,
281,
253,
8946,
1332,
534,
1057,
417,
452,
2289,
281,
436,
1491,
417,
1907,
247,
8245,
326,
4648,
436,
3081,
1491,
588,
2007,
40678,
5145,
4715,
12633,
347,
352,
3133,
4755,
326,
247,
1566,
326,
4648,
3081,
1491,
588,
562,
32231,
247,
8946,
1566,
326,
1057,
417,
897,
436,
1491,
323,
1650,
247,
13148,
14717,
11427,
69,
326,
4648,
1491,
670,
3740,
285,
966,
1537,
320,
1896,
50276,
36865,
436,
310,
9371,
891,
1158,
342,
436,
3081,
789,
352,
812,
320,
3240,
247,
9865,
7680,
347,
891,
1158,
253,
17857,
32888,
3114,
812,
320,
11797,
281,
1287,
625,
3082,
326,
2430,
2570,
24995,
3904,
824,
347,
436,
581,
50276,
15576,
50275,
783,
4477,
4495,
2920,
9713,
253,
1840,
2792,
891,
452,
5439,
619,
4868,
15672,
187,
187,
4118,
18435,
27,
2520,
310,
271,
11555,
533,
4722,
19529,
476,
359,
897,
247,
2969,
6318,
4382,
275,
958,
3520,
985,
281,
8415,
9162,
3237,
275,
13361,
247,
2014,
15523,
11999,
949,
253,
3601,
697,
1375,
310,
2529,
407,
253,
2570,
4972,
247,
6318,
4382,
2789,
247,
24287,
4872,
9261,
327,
436,
1375,
275,
824,
247,
1039,
326,
352,
11903,
4219,
253,
14787,
342,
247,
3969,
966,
824,
247,
1566,
476,
320,
30364,
50065,
407,
6041,
2097,
285,
10166,
285,
1996,
6830,
8156,
407,
271,
6318,
985,
50276,
856,
84,
50275,
18,
50276,
783,
2170,
273,
2805,
68,
310,
1077,
1774,
285,
824,
9380,
17914,
247,
747,
1708,
327,
253,
2256,
50276,
19,
17006,
281,
253,
17857,
32888,
3114,
281,
789,
327,
275,
436,
2170,
495,
22335,
3451,
50275,
5040,
50276,
18,
253,
3933,
19103,
403,
2080,
432,
256,
5503,
285,
897,
1077,
20953,
15302,
352,
310,
417,
2590,
849,
281,
755,
281,
253,
3933,
19103,
3058,
275,
3946,
374,
253,
4588,
15180,
3885,
273,
17032,
310,
417,
2590,
495,
5955,
273,
625,
9542,
3210,
285,
616,
6387,
310,
3309,
577,
3240,
247,
1643,
3731,
21937,
403,
275,
253,
2505,
534,
878,
281,
320,
4229,
275,
253,
2457,
2715,
50274
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
22649,
767,
13805,
3237,
2975,
275,
4910,
273,
13361,
278,
79,
382,
9162,
285,
6318,
17823,
10499,
247,
2014,
15523,
285,
10316,
731,
2366,
275,
247,
21010,
9978,
326,
4648,
247,
2014,
15523,
23128,
2556,
281,
253,
8820,
3268,
273,
3888,
281,
30215,
278,
79,
382,
29585,
16192,
382,
253,
10199,
273,
6318,
17823,
715,
253,
1895,
310,
949,
247,
6194,
494,
15180,
1566,
273,
247,
8325,
6821,
4069,
14213,
439,
44805,
5122,
38857,
247,
9381,
275,
247,
1029,
15759,
2570,
2317,
28763,
4136,
281,
6990,
253,
20370,
1375,
1078,
16116,
253,
6814,
2813,
253,
2929,
2722,
326,
970,
436,
27662,
21010,
285,
7558,
281,
320,
13318,
17887,
6318,
4382,
534,
6993,
347,
253,
6779,
4715,
3828,
19132,
9162,
7200,
689,
667,
643,
6779,
4715,
1332,
326,
36908,
897,
6318,
12672,
253,
2201,
1379,
12594,
310,
271,
12482,
20028,
273,
849,
271,
18307,
6318,
4382,
1537,
789,
323,
13361,
285,
752,
778,
320,
1896,
342,
4588,
42414,
50276,
296,
3755,
20556,
50276,
783,
2929,
5239,
562,
281,
897,
767,
40554,
3237,
275,
13361,
285,
6318,
17823,
281,
9569,
247,
40554,
1895,
387,
253,
15171,
285,
1057,
247,
9648,
1175,
2628,
387,
18918,
253,
1895,
18171,
1677,
326,
253,
4583,
1895,
310,
273,
3862,
1600,
281,
253,
6779,
4715,
3114,
253,
4891,
10636,
273,
253,
2929,
310,
3139,
247,
1175,
4154,
323,
14924,
50276,
36462,
8813,
273,
6318,
3054,
253,
6814,
1232,
285,
253,
3652,
8336,
273,
6318,
12672,
50276,
13982,
50276,
783,
2929,
3537,
13505,
247,
2014,
1895,
835,
642,
8946,
6779,
4715,
1332,
476,
3157,
7200,
1955,
281,
253,
958,
326,
627,
310,
247,
2014,
15523,
352,
651,
320,
1077,
12912,
281,
512,
2496,
281,
643,
873,
8777,
4931,
275,
247,
2905,
789,
2593,
281,
2085,
16055,
3634,
285,
1361,
253,
9414,
2096,
1805,
253,
8453,
273,
253,
1895,
7152,
339,
431,
248,
2929,
2175,
326,
50276,
66,
13361,
985,
970,
6318,
11689,
4245,
1805,
9162,
7200,
685,
247,
26724,
13361,
985,
762,
253,
50276,
30995,
326,
247,
9162,
3061,
556,
281,
320,
1160,
846,
5481,
273,
253,
1077,
806,
15523,
326,
4817,
949,
271,
2460,
10978,
50276,
783,
9414,
476,
9580,
326,
436,
789,
10316,
2366,
253,
13361,
285,
2805,
68,
20490,
533,
352,
310,
417,
2590,
752,
253,
1524,
16038,
273,
436,
789,
310,
285,
8558,
2139,
310,
2014,
34484,
1774,
1057,
2014,
15523,
1298,
366,
281,
247,
2014,
12275,
390,
310,
436,
1850,
5341,
253,
1077,
806,
15523,
326,
4817,
253,
5806,
671,
310,
436,
7658,
273,
15549,
253,
1077,
806,
15523,
3588,
50275,
262,
1537,
320,
1175,
281,
871,
534,
8446,
9563,
253,
2929,
604,
352,
310,
253,
13361,
8446,
840,
247,
2593,
327,
2805,
68,
5044,
296,
693,
249,
1497,
588,
1361,
390,
387,
1878,
247,
29886,
12002,
281,
4446,
1728,
253,
1127,
1728,
588,
320,
9371,
50275,
4238,
627,
247,
1921,
281,
897,
253,
8142,
16192,
382,
275,
17385,
342,
278,
79,
382,
10895,
253,
4477,
476,
671,
1908,
281,
31931,
4513,
8142,
16192,
382,
281,
269,
16192,
382,
4768,
253,
2929,
50276,
7152,
33032,
2520,
2929,
16633,
327,
253,
6318,
12672,
1754,
5145,
4715,
285,
29328,
247,
20953,
1566,
281,
17093,
253,
6318,
1491,
5162,
327,
253,
1846,
908,
1133,
15720,
6670,
10895,
278,
79,
382,
625,
685,
3387,
3888,
476,
320,
10509,
13613,
253,
4081,
1332,
4453,
4722,
285,
253,
7106,
1895,
16248,
6318,
12672,
285,
5145,
4715,
310,
273,
2176,
8453,
50276,
45563,
50276,
783,
9400,
310,
4722,
534,
6381,
2731,
253,
1563,
8607,
281,
2770,
327,
253,
5019,
273,
6318,
12672,
285,
5145,
4715,
1097,
253,
10527,
1783,
285,
5661,
1543,
7568,
326,
253,
4081,
30410,
2987,
973,
50276,
783,
24426,
1543,
275,
4677,
577,
403,
4722,
342,
253,
4081,
15523,
30410,
253,
24705,
1491,
273,
253,
3280,
3888,
476,
320,
973,
10375,
253,
15523,
30410,
14280,
281,
4711,
1781,
22652,
323,
253,
987,
5971,
50276,
783,
4081,
1332,
310,
5867,
275,
2508,
432,
2709,
24302,
1690,
1543,
327,
278,
79,
382,
13775,
12624,
285,
24426,
50276,
20881,
1255,
50276,
249,
253,
1543,
2593,
760,
3368,
1543,
273,
253,
4081,
1332,
310,
2011,
627,
403,
690,
2045,
2987,
2905,
281,
436,
2929,
824,
347,
391,
18,
625,
14023,
285,
25339,
875,
253,
4081,
1332,
285,
2045,
3082,
403,
6799,
50275,
783,
4679,
403,
5196,
327,
767,
2969,
15302,
278,
79,
382,
285,
8142,
16192,
382,
1524,
2460,
941,
403,
625,
2570,
824,
347,
18010,
3626,
3888,
812,
253,
4081,
1332,
320,
3732,
327,
625,
2570,
941,
604,
476,
849,
281,
9017,
285,
4647,
352,
4496,
2085,
824,
247,
5955,
391,
18,
2827,
20227,
465,
1688,
312,
2897,
260,
864,
277,
757,
75,
272,
632,
86,
1162,
267,
6399,
2689,
302,
5120,
3420,
323,
13345,
11454,
17032,
15523,
501,
818,
2055,
21378,
1630,
14688,
6247,
7152,
339,
431,
248,
7342,
273,
436,
789,
403,
24683,
281,
4518,
4853,
247,
4758,
4217,
281,
1097,
26289,
1346,
285,
5145,
4715,
24432,
50276,
783,
1655,
2929,
310,
271,
7126,
3213,
275,
436,
3884,
50276,
35529,
342,
2067,
3300,
18599,
273,
762,
4971,
6318,
17823,
13519,
891,
1119,
352,
2834,
281,
956,
253,
10426,
3058,
281,
11897,
13650,
50275,
3062,
1189,
1677,
619,
16125,
5251,
2793,
275,
5145,
4715,
352,
369,
1014,
625,
2834,
281,
4518,
2096,
253,
941,
285,
3733,
5933,
50275,
249,
253,
1655,
1375,
891,
513,
417,
1158,
17857,
32888,
310,
271,
4569,
18767,
323,
436,
789,
347,
352,
778,
40678,
5145,
4715,
24432,
2167,
891,
1158,
253,
3236,
4736,
273,
253,
2929,
310,
1077,
18338,
285,
1007,
3579,
281,
253,
2457,
2715,
273,
436,
789,
50276,
936,
1611,
281,
1918,
25799,
29254,
50275,
262,
2335,
479,
247,
1048,
673,
281,
2096,
253,
9978,
273,
253,
1895,
271,
23356,
273,
8142,
16192,
382,
390,
278,
79,
382,
651,
1361,
342,
271,
19363,
15523,
271,
14150,
949,
253,
2624,
483,
273,
253,
2460,
4518,
30191,
839,
326,
253,
2460,
310,
10269,
274,
1025,
285,
326,
3168,
15115,
403,
2624,
562,
6941,
253,
15523,
281,
1509,
949,
285,
247,
13562,
271,
298,
2428,
3601,
310,
2529,
50276,
261,
436,
253,
13562,
347,
973,
390,
1057,
253,
15523,
31888,
896,
285,
310,
840,
5189,
824,
271,
23356,
651,
564,
247,
1048,
1039,
323,
247,
5145,
4715,
34815,
32139,
342,
253,
4021,
31688,
3368,
285,
752,
1439,
50275,
266,
5933,
3817,
247,
5145,
4715,
8446,
310,
908,
281,
4680,
275,
2426,
273,
3733,
941,
285,
11333,
275,
436,
1083,
271,
5933,
3817,
651,
1361,
5742,
347,
352,
2722,
835,
253,
3081,
1491,
310,
3551,
432,
352,
3133,
751,
253,
3733,
310,
275,
39793,
253,
3602,
273,
253,
24287,
4979,
4518,
30191,
839,
3280,
3453,
285,
13757,
5018,
588,
1361,
19148,
253,
1332,
50274,
266,
5150,
323,
12672,
253,
13650,
273,
253,
10166,
1566,
1677,
247,
15523,
326,
4817,
949,
247,
8989,
247,
2590,
7212,
323,
12672,
253,
966,
5912,
342,
253,
10166,
24287,
5572,
3602,
50275,
49346,
12930,
253,
8245,
352,
369,
1892,
281,
1089,
253,
4278,
273,
253,
8946,
3045,
2361,
13493,
8946,
3045,
310,
21643,
41066,
285,
8018,
326,
13493,
310,
11464,
28055,
310,
627,
247,
25577,
323,
436,
891,
778,
452,
9829,
352,
604,
352,
310,
417,
11464,
28055,
840,
253,
41066,
943,
320,
4391,
285,
247,
2590,
5740,
273,
253,
10336,
3733,
941,
285,
3733,
5933,
943,
320,
908,
285,
2127,
943,
320,
2908,
275,
253,
8499,
436,
588,
1361,
5145,
4715,
12633,
2096,
4555,
752,
253,
5301,
310,
1411,
432,
247,
5145,
4715,
32764,
310,
253,
2014,
12275,
3280,
281,
253,
1566,
12421,
19958,
1046,
673,
50274,
16714,
272,
271,
3081,
8245,
4560,
247,
24287,
4979,
281,
1089,
14395,
3969,
281,
3740,
285,
966,
310,
16593,
281,
253,
8946,
1332,
534,
1057,
417,
452,
2289,
281,
436,
1491,
417,
1907,
247,
8245,
326,
4648,
436,
3081,
1491,
588,
2007,
40678,
5145,
4715,
12633,
347,
352,
3133,
4755,
326,
247,
1566,
326,
4648,
3081,
1491,
588,
562,
32231,
247,
8946,
1566,
326,
1057,
417,
897,
436,
1491,
323,
1650,
247,
13148,
14717,
11427,
69,
326,
4648,
1491,
670,
3740,
285,
966,
1537,
320,
1896,
50276,
36865,
436,
310,
9371,
891,
1158,
342,
436,
3081,
789,
352,
812,
320,
3240,
247,
9865,
7680,
347,
891,
1158,
253,
17857,
32888,
3114,
812,
320,
11797,
281,
1287,
625,
3082,
326,
2430,
2570,
24995,
3904,
824,
347,
436,
581,
50276,
15576,
50275,
783,
4477,
4495,
2920,
9713,
253,
1840,
2792,
891,
452,
5439,
619,
4868,
15672,
187,
187,
4118,
18435,
27,
2520,
310,
271,
11555,
533,
4722,
19529,
476,
359,
897,
247,
2969,
6318,
4382,
275,
958,
3520,
985,
281,
8415,
9162,
3237,
275,
13361,
247,
2014,
15523,
11999,
949,
253,
3601,
697,
1375,
310,
2529,
407,
253,
2570,
4972,
247,
6318,
4382,
2789,
247,
24287,
4872,
9261,
327,
436,
1375,
275,
824,
247,
1039,
326,
352,
11903,
4219,
253,
14787,
342,
247,
3969,
966,
824,
247,
1566,
476,
320,
30364,
50065,
407,
6041,
2097,
285,
10166,
285,
1996,
6830,
8156,
407,
271,
6318,
985,
50276,
856,
84,
50275,
18,
50276,
783,
2170,
273,
2805,
68,
310,
1077,
1774,
285,
824,
9380,
17914,
247,
747,
1708,
327,
253,
2256,
50276,
19,
17006,
281,
253,
17857,
32888,
3114,
281,
789,
327,
275,
436,
2170,
495,
22335,
3451,
50275,
5040,
50276,
18,
253,
3933,
19103,
403,
2080,
432,
256,
5503,
285,
897,
1077,
20953,
15302,
352,
310,
417,
2590,
849,
281,
755,
281,
253,
3933,
19103,
3058,
275,
3946,
374,
253,
4588,
15180,
3885,
273,
17032,
310,
417,
2590,
495,
5955,
273,
625,
9542,
3210,
285,
616,
6387,
310,
3309,
577,
3240,
247,
1643,
3731,
21937,
403,
275,
253,
2505,
534,
878,
281,
320,
4229,
275,
253,
2457,
2715,
50274
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper presents a method for adversarial attacks on object detectors by exploiting relevance maps that are originally intended for model interpretation unlike most of the existing methods that attack detection scores directly the proposed approach focuses on suppressing the relevance map associated with target objects by image perturbation the idea is interesting and demonstrates good transferability on the tasks of object detection and segmentation the paper is mostly well written and easy to follow the adversarial object dataset can also be helpful to the research community the main downside of the paper is that some of the comparisons are not appletoapple in the experiments for example the proposed approach applies update techniques ie translationinvariant to improve transferability however it is not my impression that this was done on the baseline methods which leads to unfair comparison some technical details need to be better articulated in the paper for example there is no mentioning of the adversarial loss function and how it is optimized as pointed out in 2 attacking cnn interpretations is not trivial without the details of how to update the gradients of the relevance map it would make reproducibility difficult while focusing on a different problem the proposed approach shares some similarities with methods designed to attack model interpretations such as 1 and 2 which should be discussed as related work 1 amirata ghorbani abubakar abid and james zou interpretation of neural networks is fragile in proceedings of the aaai conference on artificial intelligence volume 33 pp 36813688 2019 2 xinyang zhang ningfei wang hua shen shouling ji xiapu luoting wang interpretable deep learning under fire usenix security 20 docsepsummary this work proposes to attack object detectors by targeting their relevance maps of the different detected objects the proposed rad attack shows better black box transferability across different detectors on mscoco dataset the relevance maps are calculated based on sglrp act as an attention mechanism to the attack to focus on relevant regions in the more meaningful image and hence produce more transferable attacks strengths good attack performance and transferability between detectors which poses a security threat for sdv applications that use object detectors eight different detectors and three segmentation models are used in the rad attack which shows good generalization weaknesses missing important references abc all of these works attack object detectors and target transferability the paper is poorly written and ambiguous variables are introduced without proper definitions it is not clear how to obtain the gradients in eq3 with respect to the relevance maps no use of the proposed dataset the authors propose a new dataset of adversarial objects but never mention or showcase the datasets usefulness a straightforward way to show the datasets usefulness is by performing adversarial training and making robust detectors against the proposed attacks no enough ablation is performed the only ablation to the proposed method is in table 7 regarding the way to pick the detection target the relevance maps based on lrp are expensive and worse than recent saliency maps like cam and gradcam the attack budget epsilon 16 picked in the experiments is not justified it might be big or small for attack success and a plot of map vs epsilon for different detectors would give more information about the effect of the attack all the attacks in the paper are performed on yolov3 and 
transferred to other models it would be more informative to show transferability matrices of attacks performed on all models and transferred to all others the novelty of the proposed methodology is limited while the use of relevance maps to improve the transferability of attacks on object detectors is novel no proper explanation is provided the attacks are based on pgd and the relevance map is adapted from sglrp the paper offers no theoretical results or exciting insights a huang et al universal physical camouflage attacks on object detectors cvpr 2020 b wu et al making an invisibility cloak real world adversarial attacks on object detectors eccv 2020 c xu et al adversarial tshirt evading person detectors in a physical world eccv 2020 minor issues many grammar mistakes because they possess multipleoutput among the classification attacks and detection ones crossdomain attack naseer et al 2019 is the most effective but rad is more aggressive etc no question marks in titles 3135 table 25 could have been visualized better by using a bar chart for example to observe the relative performance of attacks and defenses docsepsummary the paper found a new way called relevance attack on detectors rad to generate high transferability adversarial examples against detectors by suppressing the multinode relevance moreover a dataset generated by this attacking method is introduced pros 1 the designed rad is new and technically sound 2 the experimental results on multiple detectors like yolov3 retinanet mask rcnn etc are promising cons 1 its not surprising that the object detectors can be attacked together one naive way is that using the ensemble attack by average the negative ce loss like the paper making an invisibility cloak real world adversarial attacks on object detectors proposed so i think some baselines need to be added in experiments 2 by imposing diversity transformation on input image can also improve transferability boosting adversarial attacks with momentum similar rules also can be applied in detectors 3 i cannot see a very significant value of the generated dataset like in the image classification dataset one can generate adversarial examples easily but there is not an individual dataset that only contains generated adversarial examples generated by one specific method some famous dataset like imagenetc benchmarking neural network robustness to common corruptions and perturbations is designed by natural perturbationsdocsepa transferable adversarial attack method is proposed for object detection a relevance map is used to discover the common weakness of existing detectors an adversarial dataset for object detection is created for experimental validation the relevance map proposed indeed derives from sglrp as mentioned in sec 33 the only modification is to change a single node target t to the average of a set of nodes as the major contribution claimed is on the relevance map the minor modification makes the contribution limited the relevance map is computed by backpropagating the relevance r from the final layer to the input following rules as illustrated in sec 33 however the proposed method is claimed to perform a blackbox transferable attack also the gradients are utilized to update the relevance map it is thus not clear how this relevance map correlates to the blackbox transferable attack in the experiments the proposed attack method shall be compared to sota detection or segmentationbased methods eg efficientdet cascadercnn librarcnn
### Summary: | this paper proposes a transferable adversarial attack method for object detection by using the relevance map four reviewers provided detailed reviews 2 of them rated ok but not good enough rejection 1 rated marginally below and 1 rated marginally above while reviewers consider the paper well written and using relevance map novel a number of concerns are raised including limited novelty the lack of theoretical results no use of the proposed dataset insufficient ablation etc during the rebuttal the authors made efforts to response to all reviewers comments however the major concerns remain and the rating were not changed the acs concur these major concerns and agree that the paper can not be accepted at its current state | [
[ input_ids: 1,425 token ids beginning 30003, 310, 1677, ...; full list omitted ] |
[ attention_mask: 1,425 ones; full list omitted ] |
[ labels: 1,425 token ids beginning 30003, 310, 1677, ...; full list omitted ]
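Each record in this dump pairs the prompt, review, and summary text with three integer sequences of equal length: a token-id sequence, a mask of ones, and a second token-id sequence whose visible values match the first, i.e. the usual input_ids / attention_mask / labels triple of a causal language-model training set. The longest id sequence in this excerpt runs to 2,048 entries, which suggests a 2,048-token truncation limit. The sketch below shows how such a row could be built; the tokenizer name is only a stand-in, since the dump does not say which tokenizer produced these ids, and setting labels equal to input_ids is an assumption based on the two id sequences appearing identical here.

```python
from transformers import AutoTokenizer

# "gpt2" is a stand-in; the tokenizer that actually produced the ids in this
# dump is not identified here.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def build_row(prompt, review, summary, max_length=2048):
    """Tokenize one review/summary pair into the three sequences shown above."""
    text = f"{prompt}\n### Review:\n{review}\n### Summary: {summary}"
    enc = tokenizer(text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when nothing is padded
        "labels": list(enc["input_ids"]),         # assumption: labels mirror input_ids
    }
```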
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
authors present a method for expanding taxonomies with synonyms or aliases the proposed method has two stages 1 generate synonym candidates from wordnet and then 2 use a binary classifier to filter the candidates the method is simple and effective paper is well written with ample empirical study and analysis couple of minor comments 1 rather than using wordnet for step 1 is it possible to use a similarity based clustering method to mine candidates for each concept from a corpus 2 for the word embeddings used in step 2 did the authors use offtheshelf precompuated embeddings or compute the embeddings from a domain specific in this case shopping corpus will the performance improve if a domain specific embedding is applied docsepthe paper presents an interesting approach to taxonomy extension that is based on identifying synonyms for component words of multiword terms in the taxonomy the approach seems to rely very much on wordnet which may be a weakness coverage of wordnet is rather limited and the approach may therefore be limited in application also word sense disambiguation selecting the appropriate sense for a component word is a challenge that has not been addressed in full detail although this will be dealt with by the classification step in filtering if i understand correctly overall the paper is wellwritten and clear in ambitions and achieved results the experiments use an extensive crowdsourced gold standard which is a valuable research outcome on its own if it will be released publiclydocsepthe authors describe and evaluate two approaches to collecting alternative aliases synonyms for entities in a taxonomy expansion from wordnet synsets and from search queries followed by a binary classification to refine the generated candidate sets mitigating vocabulary mismatch in search applications provides a good motivating use case for ontologytaxonomy construction and is an important research direction questions how were the negative samples for training the classifier selected in 43 what is the overlap between the synonym sets generated using wordnet and the search queries can the wordnetgenerated candidates improve performance for aligning synonyms collected from search queries ie output of the first method as input to the second synonym selection method are there other evaluation results that can show improvement from implementing the proposed approaches on the target tasks eg search or information extraction remarks semantic network seems to be a synonym for a knowledge graph which is a more frequently used term the relation has to be made explicit the structure of the paper is confusing only one of the candidate selection methods is described in the section 3 but experimental results for two approaches are reported in section 4
### Summary: | the work provides an interesting and yet rather straightforward approach to synonym expansion that relies on a combination of user queries and existing background knowledge in terms of wordnet the evaluation shows good results for an interesting domain in practice it would be great if the crowd sourced data would be released i also wonder if the paper actually covered enough of the related work in particular with respect to synonym expansion from information retrieval overall i think the task itself and the resource would be interesting to have at the conference although its not a radical innovation hence i would recommend it for the poster session | [
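The reviews above describe the method's first stage as pulling synonym candidates for the component words of a taxonomy term from WordNet, with a binary classifier filtering the candidates afterwards. The snippet below is a minimal illustration of that candidate-generation stage using NLTK's WordNet interface; it is not the authors' implementation, and the classifier filter is omitted.

```python
from nltk.corpus import wordnet as wn  # requires a one-time nltk.download("wordnet")

def wordnet_candidates(word):
    """Collect lemma names from every WordNet synset of `word` as synonym
    candidates; a trained binary classifier would filter these afterwards."""
    candidates = set()
    for synset in wn.synsets(word):
        for lemma in synset.lemmas():
            name = lemma.name().replace("_", " ")
            if name.lower() != word.lower():
                candidates.add(name)
    return sorted(candidates)

# Example: candidates for a single component word of a multi-word term.
print(wordnet_candidates("bag"))
```

As one of the reviews notes, coverage is bounded by WordNet itself, which is why word sense selection and the downstream classifier matter in practice.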
[ input_ids: 641 token ids beginning 30003, 310, 1677, ...; full list omitted ]
[ attention_mask: 641 ones; full list omitted ]
[ labels: 641 token ids beginning 30003, 310, 1677, ...; full list omitted ]
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
in this paper the authors propose a new optimization method for continual learning the authors propose a taskaware optimizer to adapt the learning rate for each task the proposed method is evaluated on several datasets to show its effectiveness i think the idea is clear and the experiments verify the effectiveness of the proposed method however i found the results of agem in this paper are significantly worse than the original paper for example the accuracy of splitcifar100 is 5425 in this submission while in agem the accuracy is 62 could you provide more details in addition though the submission addresses the catastrophic forgetting by proposing a new optimizer the discussion of task correlations are very similar to gem and its variants eg agem and claw continual learning with adaptive weights more analysis about why the proposed method can outperform other approaches would make the paper more convincing missing reference 1 is a very relevant work i think it might be valuable to compare and discuss with it btw the authors change the margin of the iclr format please kindly revise it in the next version i am willing to adjust my score if the authors can address my concerns 1 guo yunhui mingrui liu tianbao yang and tajana rosing improved schemes for episodic memorybased lifelong learning arxiv preprint arxiv190911763 2019 after rebuttal the authors response does partially address my concerns after reading the authors rebuttal and other reviewers comments i still think the contributions are not enough for publication i will keep my score the idea is clear and the results seem promising however the authors are supposed to provide more analysis to understand why the proposed method works docsepthe authors present an approach for lifelong learning where each task is processed in a single pass they propose to adapt the learning rate of the training algorithm depending on the current tasks similarity to the previously observed tasks the learning rate is decreased when there are many dissimilar tasks to avoid catastrophic forgetting the paper presents an adapted rmsprop algorithm but the procedure can be adjusted to adagrad and adam as well the presented evaluation on computer vision datasets shows that the proposed modification helps to increase the final accuracy of those methods while keeping the forgetting low when compared to baselines from literature under the conditions of the single pass setting the proposed method achieves better final accuracy and keeps approximately the same forgetting rate pros general approach that can be adapted to multiple optimization algorithms and used in other metaalgorithms strong performance in a singlepass scenario can achieve better learning accuracy than naive methods cons the effect of the proposed procedure on optimization algorithms is not studied enough for example its unclear how would a few similar tasks help to learn faster when there are many distinct tasks present additional insights like visualizations of trajectories will help to build an intuition and boost the method adoption in practice effect of task order is not studiedhighlighted the method seems to be very dependent on the task order and it is unclear how the order is chosen in experiments minor commentsquestions what is the point of having lambdatt in eq 3 ie why do you need to multiply the currently accumulated gradients by task similarity to itself in appendix 33 figure 5a there is a noticable difference between rmsprop and tagrmsprop on the first task what is the reason for this 
shouldnt they be identicalclose the presented method is a general procedure that helps to improve performance of many methods in lifelong learning scenario the presented experimental results are convicing however the paper would benefit from more research and intuition on why the approach works i recommend to accept the paper docsepthis paper proposes tag a method for continual learning in the taskincremental setting this method relies on storing and using task gradients while learning a set of supervisd tasks sequentially the influence taskbased accumulated gradients is regulated through a learning rate that is adaptive according to the relatedness of the current task with the previously observed ones the authors report results on benchmark datasets for continual learning of up to 20 disjoint tasks comparisons against naive noncontinual optimizers such as sgd adam and rmsprop are reported along with results in the continual learning setting with some stateoftheart methods such as ewc agem and er the authors report performance in terms of overall accuracy forward transfer and learning accuracy la strenghts of this paper are the method provides a simple yet effective way of dealing with catastrophic forgetting which is to some extent original since it proposes to use the relatedness among tasks and use this information to control gradient updates while learning new tasks the method is technically sound and most of the experimental evaluation is aligned with typical evaluations in the area including datasets used and reported metrics the paper is wellwritten wellstructured and easy to follow weaknesses of this paper are my main concern is regarding memorystorage size requirements of the proposed method during training although the proposed method avoids the need to keep examples of previous tasks as compared to replaybased methods and of expanding the network compared to network expansion methods it certainly adds memorystorage requirements of the gradients based on this i would expect to see comparisons of the proposed method vs replaybased methods and network expansion methods in terms of memory usage i think that these concerns need to be fully addressed in this paper to really demonstrate the advantages of the proposed approach furthermore i would expect to see these comparisons for a reasonably large number of tasks since memorystorage requirements increase along with the number of tasks my second biggest concern is where are the actual gains of the proposed approach as i can infer from table 1 most of the gain in final accuracy is due to a gain in la according to the definition of la this is actually the accuracy of each new task therefore i do not agree with the claim in page 8 the higher la with similar forgetting as compared to other baselines shows that while tag exploits the adaptive nature of existing optimizers it also ensures minimal forgetting of the gained knowledge hence even if a similar or lower forgetting occurs in tag the higher test accuracy with high la shows that tag is capable of retaining the gained knowledge from each task since from table 1 forgetting levels are actually very similar to those experienced by other methods and therefore a higher accuracy which i agree is due to la comes mainly from learning new tasks better rather than encouraging minimal forgetting or retaining the gained knowledge as it is claimed i would strongly suggest to clarify this point in the experiments i do not see the reason for not comparing with network expansion methods the fact that 
the proposed method is not making any change to the size of the model as mentioned in page 8 does not create any limitation for comparing with these kinds of methods in the experimental setting used in the paper finally i see a lot of similarities of the proposed method with existing methods to control gradient updates such as ogd 1 and owm 2 therefore i would expect comparisons to these methods 1 farajtabar m azizan n mott a li a 2020 june orthogonal gradient descent for continual learning in international conference on artificial intelligence and statistics pp 37623773 pmlr 2 zeng g chen y cui b yu s 2019 continual learning of contextdependent processing in neural networks nature machine intelligence 18 364372 although the paper proposes a simple method for controlling catastrophic forgetting in the continual learning setting there are major drawbacks flaws in the experimental results and the evaluation of the proposed approach given the nature of tag i think is is fundamental to measure and report memorystorage requirements in comparison to other methods that also require additional memorystorage such as replaybased methods and network expansion methods furthermore claims around avoidance of catastrophic forgetting in the main results reported in table 1 and explained in page 8 are not wellsupported since it is clear that the method is strong at learning new tasks while not necessarily much better than counter parts regarding catrastrophic forgetting docsepthis paper proposes a taskaware adaptive learning rate method tag for continual learning the optimizer tag is to promote the learning rate if they are similar to previous tasks while decreasing the learning rates if they are dissimilar to previous tasks without storing previous examples the authors combined the proposed method with existing optimizers such as adam sgd etc to demonstrate the effectiveness of the proposed method experimental results on several datasets show the improvements over naive optimizers strength the paper is clearly written and easy to follow experiments on different settings are considered weakness the technical novelty of the proposed method is limited in fact tag is similar to lamaml 1 gupta et al2020 both the motivations and derived formula equation 2 of tag are very similar to 1 from this aspect tag can be seen as a memoryfree version of lamaml misinterpretation of related works the author claims that another related work gupta et al2020 also employs an adaptive learning rate while requiring a small episodic memory but it is based on a metalearning setting and hence beyond the scope of our paper this is an incorrect statement lamaml gupta et al2020 is to solve the continual learning problems with the tools from meta learning they are not in metalearning setting but instead in the continual learning setting the proposed method seems to only work on the taskaware setting more recent works focus on the taskfree continual learning setting where the task identity and boundary are unknown during the learning process it would strengthen the proposed method if tag can be applied in such scenarios the experiments on more other architectures such as mlp may demonstrate the effectiveness of the proposed method the method novelty is limited and experiment needs to be improved
### Summary: | the authors develop a memorybased method for continual learning that stores gradient information from past tasks this memory is then used by a proposed taskaware optimizer that based on the task relatedness aims at preserving knowledge learned in previous tasks the initial reviews were reasonable but indicated that this paper was not yet ready to be published in particular the reviewers seemed to agree on the somewhat limited methodological novelty of the paper given prior work such as lamaml and ogd in terms of method and gem in terms of task similarity comparison in their response the authors do seem to agree to a certain extent with some of the criticisms but also point to clear differences with respect to previous work and other distinguishing aspects such as a smaller memory footprint than ogd the authors also carefully responded to reviewer comments and provided additional results when possible in the end the main criticism from the reviewers remained reviewer 95tf also suggests that the authors should compare their method to others in terms of memory consumption which the authors partly did and compare to replaybased methods and this paper was a borderline one three out of the four reviewers suggest that it is not ready to be published one reviewer did give it a high score 8 but also understood the limitations raised by the other reviewers as a result my recommendation is that this paper falls below the acceptance threshold i am sorry that for this recommendation and i strongly suggest the authors consider the reviewers suggestions in preparing the next version of this work in particular it seems like providing a full study of the memory usage of your approach vs others as well as providing more insights about the trajectory see the comment from zr5n might go a long way toward improving the paper | [
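The reviews above describe TAG as raising the learning rate when the current task's accumulated gradients are similar to those of earlier tasks and lowering it when they are dissimilar. The sketch below paraphrases that idea with a cosine-similarity factor purely for illustration; the exact update rule, including the lambda_tt term one reviewer questions, lives in the paper's equations 2 and 3 and is not reproduced here.

```python
import torch
import torch.nn.functional as F

def task_adaptive_lr(base_lr, current_grad, past_task_grads, scale=1.0):
    """Illustration only: modulate the step size by how similar the current
    task's accumulated gradient is to gradients stored from earlier tasks.
    Similar tasks increase the rate, dissimilar tasks decrease it; this is
    not the exact rule from the paper under review."""
    if not past_task_grads:
        return base_lr
    sims = torch.stack([
        F.cosine_similarity(current_grad.flatten(), g.flatten(), dim=0)
        for g in past_task_grads
    ])
    # mean similarity lies in [-1, 1]; exponentiate to obtain a positive factor
    return base_lr * float(torch.exp(scale * sims.mean()))
```

Storing one accumulated gradient per past task is also what drives the memory-cost concern raised in the reviews, since the footprint grows with the number of tasks.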
[ input_ids: 2,048 token ids beginning 7274, 651, 1056, ...; full list omitted ]
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
7274,
651,
1056,
253,
2929,
625,
21414,
50276,
33722,
3806,
337,
310,
247,
1077,
4623,
789,
891,
1158,
352,
1537,
320,
9865,
281,
7277,
285,
2319,
342,
352,
50276,
2612,
88,
253,
4477,
1818,
253,
8459,
273,
253,
17857,
32888,
5981,
4496,
26604,
49620,
352,
275,
253,
1735,
2715,
50276,
74,
717,
7378,
281,
4575,
619,
4868,
604,
253,
4477,
476,
2953,
619,
7350,
50276,
18,
1149,
80,
340,
328,
37942,
43261,
579,
74,
632,
86,
246,
757,
5830,
80,
30966,
285,
246,
1432,
3230,
687,
4093,
5520,
15849,
323,
6314,
23329,
3541,
3169,
36536,
4715,
549,
32693,
638,
3845,
549,
32693,
746,
2693,
883,
23250,
6247,
50275,
6438,
30080,
22559,
50274,
783,
4477,
2380,
1057,
10571,
2953,
619,
7350,
846,
4361,
253,
4477,
30080,
22559,
285,
643,
30628,
5701,
891,
1335,
1158,
253,
9021,
403,
417,
2217,
323,
9311,
891,
588,
1978,
619,
4868,
253,
2934,
310,
2590,
285,
253,
1543,
1646,
12532,
2299,
253,
4477,
403,
6326,
281,
2085,
625,
1783,
281,
2096,
2139,
253,
4081,
1332,
2987,
5474,
339,
431,
248,
4477,
1246,
271,
2746,
323,
36536,
4715,
835,
1016,
4836,
310,
11742,
275,
247,
2014,
1509,
597,
12661,
281,
5223,
253,
4715,
2281,
273,
253,
3733,
5933,
7293,
327,
253,
1655,
8892,
14259,
281,
253,
3786,
2540,
8892,
253,
4715,
2281,
310,
6137,
672,
627,
403,
1142,
43110,
8892,
281,
3693,
36256,
37264,
253,
2929,
10262,
271,
12956,
391,
983,
8560,
5933,
533,
253,
5199,
476,
320,
10904,
281,
519,
356,
4614,
285,
38622,
347,
973,
253,
3559,
7103,
327,
4382,
8113,
15302,
2722,
326,
253,
4081,
11237,
7729,
281,
2572,
253,
2457,
7200,
273,
1110,
3082,
1223,
7562,
253,
37264,
1698,
672,
2429,
281,
1666,
25379,
432,
6239,
762,
253,
2515,
273,
253,
2014,
1509,
4758,
253,
4081,
1332,
33526,
1805,
2457,
7200,
285,
11359,
5512,
253,
1072,
37264,
2281,
50276,
856,
84,
50276,
16691,
2746,
326,
476,
320,
12956,
281,
2709,
13757,
11333,
285,
908,
275,
643,
11419,
267,
46042,
50276,
9072,
3045,
275,
247,
2014,
5858,
10076,
50276,
5092,
5115,
1805,
4715,
7200,
685,
27785,
3082,
50276,
5040,
50276,
783,
1055,
273,
253,
4081,
5199,
327,
13757,
11333,
310,
417,
5421,
2217,
323,
1650,
697,
12744,
849,
651,
247,
1643,
2074,
8892,
1361,
281,
3037,
7938,
672,
627,
403,
1142,
5799,
8892,
1246,
3081,
16039,
751,
5304,
5904,
273,
24102,
588,
1361,
281,
1973,
271,
30328,
285,
9510,
253,
1332,
16253,
275,
3946,
50275,
8222,
273,
4836,
1340,
310,
417,
5421,
30703,
264,
253,
1332,
3133,
281,
320,
1077,
7976,
327,
253,
4836,
1340,
285,
352,
310,
12744,
849,
253,
1340,
310,
6777,
275,
4679,
50276,
37585,
5701,
34974,
50276,
5371,
310,
253,
1127,
273,
1907,
24082,
69,
1595,
275,
16186,
495,
26332,
2139,
513,
368,
878,
281,
30247,
253,
4390,
20821,
27935,
407,
4836,
14259,
281,
3139,
50276,
249,
30762,
5922,
4677,
608,
66,
627,
310,
247,
417,
35232,
3064,
875,
391,
983,
8560,
285,
6809,
83,
983,
8560,
327,
253,
806,
4836,
752,
310,
253,
1921,
323,
436,
943,
2649,
597,
320,
8931,
10483,
253,
3559,
1332,
310,
247,
2087,
5199,
326,
7729,
281,
3157,
3045,
273,
1142,
3082,
275,
36536,
4715,
10076,
253,
3559,
5661,
1543,
403,
2410,
11733,
2299,
253,
2929,
651,
5649,
432,
625,
2561,
285,
30328,
327,
2139,
253,
2746,
2987,
891,
5583,
281,
2997,
253,
2929,
5474,
33032,
2520,
2929,
29328,
6809,
247,
1332,
323,
45120,
4715,
275,
253,
4836,
19687,
30132,
4758,
436,
1332,
15771,
327,
20073,
285,
970,
4836,
27935,
1223,
4715,
247,
873,
273,
2221,
4534,
69,
8892,
32627,
253,
4833,
4836,
3169,
20821,
27935,
310,
13527,
949,
247,
4715,
2281,
326,
310,
17825,
2556,
281,
253,
2905,
1255,
273,
253,
1655,
4836,
342,
253,
3786,
2540,
4394,
253,
4477,
1304,
1543,
327,
22791,
15302,
323,
45120,
4715,
273,
598,
281,
1384,
28465,
8892,
14023,
1411,
27785,
1327,
8190,
780,
5556,
14460,
824,
347,
256,
35333,
38622,
285,
391,
983,
8560,
403,
2361,
2112,
342,
1543,
275,
253,
45120,
4715,
4758,
342,
690,
1375,
23037,
14387,
3082,
824,
347,
299,
38212,
639,
358,
285,
2827,
253,
4477,
1304,
3045,
275,
2426,
273,
4583,
7200,
3579,
3700,
285,
4715,
7200,
826,
4056,
384,
84,
273,
436,
2929,
403,
50275,
783,
1332,
3400,
247,
2969,
2568,
3576,
1039,
273,
10620,
342,
36256,
37264,
534,
310,
281,
690,
6070,
3236,
1580,
352,
29328,
281,
897,
253,
2905,
1255,
2190,
8892,
285,
897,
436,
1491,
281,
1453,
11786,
11269,
1223,
4715,
747,
8892,
50276,
783,
1332,
310,
22335,
3590,
285,
954,
273,
253,
5661,
7103,
310,
15616,
342,
6867,
27163,
275,
253,
2170,
1690,
15302,
908,
285,
2361,
17082,
50276,
783,
2929,
310,
973,
15720,
973,
34218,
285,
3477,
281,
956,
50276,
20881,
1255,
265,
273,
436,
2929,
403,
50275,
2577,
2022,
4468,
310,
5001,
3541,
22214,
1979,
6095,
273,
253,
4081,
1332,
1309,
3733,
3738,
253,
4081,
1332,
32547,
253,
878,
281,
1978,
6667,
273,
2045,
8892,
347,
2429,
281,
44864,
3169,
3082,
285,
273,
16122,
253,
2990,
2429,
281,
2990,
7466,
3082,
352,
5604,
11323,
3541,
22214,
6095,
273,
253,
27935,
1754,
327,
436,
891,
651,
1902,
281,
923,
14023,
273,
253,
4081,
1332,
4632,
44864,
3169,
3082,
285,
2990,
7466,
3082,
275,
2426,
273,
3541,
10393,
50276,
74,
1158,
326,
841,
7350,
878,
281,
320,
4751,
9713,
275,
436,
2929,
281,
1663,
7568,
253,
11361,
273,
253,
4081,
2746,
33810,
891,
651,
1902,
281,
923,
841,
14023,
323,
247,
12054,
1781,
1180,
273,
8892,
1580,
3541,
22214,
6095,
2572,
2112,
342,
253,
1180,
273,
8892,
50275,
2577,
1273,
5962,
4468,
310,
835,
403,
253,
4588,
15988,
273,
253,
4081,
2746,
347,
891,
476,
9441,
432,
2829,
337,
954,
273,
253,
6351,
275,
2457,
7200,
310,
1955,
281,
247,
6351,
275,
826,
2556,
281,
253,
5426,
273,
826,
436,
310,
2686,
253,
7200,
273,
1016,
747,
4836,
3103,
891,
513,
417,
5194,
342,
253,
1750,
275,
3239,
854,
253,
2169,
826,
342,
2074,
37264,
347,
2429,
281,
643,
1666,
25379,
2722,
326,
1223,
6809,
40725,
253,
17825,
3753,
273,
5368,
5556,
14460,
352,
671,
20096,
8723,
37264,
273,
253,
12103,
3640,
7613,
1014,
604,
247,
2074,
390,
2406,
37264,
6634,
275,
6809,
253,
2169,
1071,
7200,
342,
1029,
826,
2722,
326,
6809,
310,
7032,
273,
26179,
253,
12103,
3640,
432,
1016,
4836,
1580,
432,
2829,
337,
37264,
2308,
403,
2686,
1077,
2074,
281,
1110,
7407,
407,
643,
3082,
285,
3103,
247,
2169,
7200,
534,
891,
5194,
310,
1955,
281,
826,
3249,
7194,
432,
4715,
747,
8892,
1805,
2581,
685,
18462,
8723,
37264,
390,
26179,
253,
12103,
3640,
347,
352,
310,
7558,
891,
651,
7052,
1804,
281,
19148,
436,
1127,
50275,
249,
253,
4679,
891,
513,
417,
923,
253,
1921,
323,
417,
10941,
342,
2990,
7466,
3082,
253,
958,
326,
253,
4081,
1332,
310,
417,
2403,
667,
1818,
281,
253,
1979,
273,
253,
1566,
347,
5393,
275,
3239,
854,
1057,
417,
2794,
667,
12291,
323,
10941,
342,
841,
9351,
273,
3082,
275,
253,
5661,
4758,
908,
275,
253,
2929,
50275,
71,
3341,
891,
923,
247,
2257,
273,
22620,
273,
253,
4081,
1332,
342,
5368,
3082,
281,
1453,
11786,
11269,
824,
347,
9040,
69,
337,
285,
18454,
78,
374,
3103,
891,
651,
1902,
14023,
281,
841,
3082,
50275,
18,
2080,
1432,
8476,
274,
278,
11775,
478,
266,
295,
278,
1519,
247,
50276,
965,
247,
9169,
480,
2517,
19627,
11786,
18499,
323,
45120,
4715,
275,
5213,
8059,
327,
13345,
9260,
285,
9990,
7266,
38516,
1508,
40761,
268,
1686,
83,
50276,
19,
1182,
1205,
305,
260,
864,
340,
36707,
270,
50276,
30838,
256,
6247,
45120,
4715,
273,
3634,
6820,
5162,
275,
11454,
6928,
3753,
5145,
9260,
1283,
35585,
29412,
3738,
253,
2929,
29328,
247,
2969,
1332,
323,
10938,
36256,
37264,
275,
253,
45120,
4715,
4758,
627,
403,
2201,
30453,
32138,
275,
253,
5661,
1543,
285,
253,
7103,
273,
253,
4081,
2746,
1677,
253,
3753,
273,
6809,
891,
1158,
310,
310,
7936,
281,
2557,
285,
1304,
3541,
22214,
6095,
275,
5301,
281,
643,
3082,
326,
671,
2430,
3081,
3541,
22214,
824,
347,
44864,
3169,
3082,
285,
2990,
7466,
3082,
33810,
3916,
1475,
28772,
273,
36256,
37264,
275,
253,
2022,
1543,
2361,
275,
2829,
337,
285,
5544,
275,
3239,
854,
403,
417,
973,
19391,
1580,
352,
310,
2590,
326,
253,
1332,
310,
2266,
387,
4715,
747,
8892,
1223,
417,
7933,
1199,
1805,
685,
4828,
4243,
5001,
5798,
42836,
16117,
37264,
50276,
7152,
33032,
2520,
2929,
29328,
247,
50276,
14605,
13823,
17825,
4715,
2281,
1332,
6809,
323,
45120,
4715,
253,
5556,
6081,
6809,
310,
281,
8591,
253,
4715,
2281,
604,
597,
403,
2074,
281,
2045,
8892,
1223,
11052,
253,
4715,
4142,
604,
597,
403,
43110,
281,
2045,
8892,
1293,
20073,
2045,
6667,
253,
4477,
5678,
253,
4081,
1332,
342,
5368,
5556,
14460,
824,
347,
38622,
256,
35333,
3966,
281,
7568,
253,
12510,
273,
253,
4081,
1332,
5661,
1543,
327,
2067,
15302,
921,
253,
11701,
689,
27785,
5556,
14460,
50274,
45563,
50274,
783,
2929,
310,
4518,
3542,
285,
3477,
281,
956,
50275,
16217,
3825,
327,
1027,
7533,
403,
2783,
50274,
20881,
1255,
50275,
783,
7681,
38135,
273,
253,
4081,
1332,
310,
3710,
275,
958,
6809,
310,
2074,
281,
16519,
16878,
337,
1149,
37668,
1162,
355,
14952,
1097,
253,
42852,
285,
6012,
7212,
5150,
374,
273,
6809,
403,
1077,
2074,
281,
337,
432,
436,
4809,
6809,
476,
320,
2326,
347,
247,
3541,
4924,
2715,
273,
16519,
16878,
50274,
24418,
22416,
318,
273,
2905,
2987,
50276,
783,
2488,
3916,
326,
1529,
2905,
789,
1149,
37668,
1162,
355,
14952,
671,
27532,
271,
17825,
4715,
2281,
1223,
10568,
247,
1355,
6314,
23329,
3541,
533,
352,
310,
1754,
327,
247,
5148,
613,
920,
4758,
285,
7613,
4457,
253,
7990,
273,
776,
2929,
436,
310,
271,
13583,
3908,
16519,
16878,
50276,
4297,
37668,
1162,
355,
14952,
310,
281,
8415,
253,
45120,
4715,
3237,
342,
253,
5657,
432,
50276,
13518,
4715,
597,
403,
417,
275,
5148,
613,
920,
4758,
533,
3185,
275,
253,
45120,
4715,
4758,
50274,
783,
4081,
1332,
3133,
281,
760,
789,
327,
253,
4836,
13823,
4758,
625,
3332,
2987,
2770,
327,
253,
4836,
4924,
45120,
4715,
4758,
835,
253,
4836,
6489,
285,
7548,
403,
7202,
1309,
253,
4715,
1232,
50276,
262,
651,
17084,
253,
4081,
1332,
604,
50276,
7784,
476,
320,
3732,
275,
824,
15216,
50273,
783,
4679,
327,
625,
643,
35615,
824,
347,
13361,
81,
778,
7568,
253,
12510,
273,
253,
4081,
1332,
50270,
783,
1332,
38135,
310,
3710,
285,
3368,
3198,
281,
320,
5520,
50275,
187,
187,
4118,
18435,
27,
783,
4477,
1287,
247,
3541,
3169,
1332,
323,
45120,
4715,
326,
10111,
11786,
1491,
432,
2469,
8892,
436,
3541,
310,
840,
908,
407,
247,
4081,
4836,
13823,
5556,
6081,
326,
1754,
327,
253,
4836,
2905,
1255,
13698,
387,
24279,
3640,
6311,
275,
2045,
8892,
50276,
783,
3302,
10123,
497,
5272,
533,
4860,
326,
436,
2929,
369,
417,
2568,
4704,
281,
320,
3863,
275,
1798,
253,
30628,
4455,
281,
5194,
327,
253,
8489,
3710,
35961,
38135,
273,
253,
2929,
1677,
2720,
789,
824,
347,
16519,
16878,
285,
9040,
69,
275,
2426,
273,
1332,
285,
16915,
275,
2426,
273,
4836,
14259,
5301,
50276,
249,
616,
2380,
253,
4477,
513,
1646,
281,
5194,
281,
247,
2176,
6070,
342,
690,
273,
253,
43680,
533,
671,
1127,
281,
2590,
3910,
342,
1675,
281,
2045,
789,
285,
643,
32495,
7794,
824,
347,
247,
4577,
3541,
33257,
685,
9040,
69,
253,
4477,
671,
9257,
10974,
281,
37317,
5701,
285,
2530,
3081,
1543,
672,
1896,
50276,
249,
253,
990,
253,
2022,
14226,
432,
253,
30628,
6376,
37317,
5325,
16114,
671,
5936,
326,
253,
4477,
943,
7277,
616,
1332,
281,
2571,
275,
2426,
273,
3541,
8353,
534,
253,
4477,
13730,
858,
285,
7277,
281,
44864,
3169,
3082,
285,
436,
2929,
369,
247,
45210,
581,
1264,
562,
273,
253,
1740,
30628,
1804,
326,
352,
310,
417,
4704,
281,
320,
3863,
581,
37317,
858,
1918,
352,
247,
1029,
4868,
854,
533,
671,
7192,
253,
7364,
5439,
407,
253,
643,
30628,
347,
247,
906,
619,
17401,
310,
326,
436,
2929,
11521,
2708,
253,
14924,
7887,
50275,
74,
717,
7016,
326,
323,
436,
17401,
285,
891,
7052,
1804,
253,
4477,
1908,
253,
30628,
13991,
275,
13828,
253,
1735,
2715,
273,
436,
789,
275,
1798,
352,
3133,
751,
5277,
247,
2120,
1263,
273,
253,
3541,
10393,
273,
634,
2746,
4632,
2571,
347,
973,
347,
5277,
625,
16039,
670,
253,
18974,
923,
253,
4385,
432,
1182,
83,
22,
79,
1537,
564,
247,
1048,
1039,
2584,
11138,
253,
2929
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a scalable meanfield game mfg approach to the problem of cloud resource management it introduces an online natural actorcritic nac algorithm for mfgs which uses function approximation and lets the meanfield state naturally evolve as the agents learn the authors establish finitetime convergence of nac with linear function approximation and softmax parameterization but also implement neuralnet based function approximation experimental results on opensource serverless platform openwhisk with realworld workloads from production traces demonstrate that nac is scalable to a large number of agents and significantly outperforms other common baselines strengths 1 the paper is well written even the theory parts that are more involved are not hard to follow and the authors provide adequate refences and clarifications to explain the different concepts i found this very helpful because the nac algorithm contains a number of steps such as policy evaluation gradient estimation meanfield update and policy update 2 the problem that motivates this work is an important realworld problem furthermore a mfg approach is a quite reasonable design choice for this problem in particular for large numbers of users where collective user behavior can be summarized by a population distribution and a common policy can be applied to all agents 3 i was unable to check the proofs in detail but the proposed algorithm and the ingredients therein make sense as they follow faithfully various prior works 4 the related work is concise but seems to cover a lot of important and relevant prior works 5 the experiments show that the proposed nac does not suffer from the zigzag fluctuations of the doubleloop version of nac that uses a fixed point iteration furthermore experiments with realworld workloads from production traces show that nac significantly outperforms other common baselines such as ensure and openwhisks original resource manager weaknesses 1 the novelty is not strong convergence of entropyregularized natural policy gradient with linear function approximation has been already studied by cayci he and srikant 2021 and indeed this work builds very heavily upon this specific work which incidentally is a preprint and has not been accepted to peerreviewed venues yet the main difference is that nac is now applied to a mfg setting where the main solution concept is that of a nash equilibrium still due to the meanfield approximation the agent essentially faces a singleagent policy optimization problem and the gametheoretic setting roughly becomes equivalent to a singleagent markov decision process that said i do not claim that the two works are identical since the deal with a different setting however they do overlap significantly 2 in the experimental evaluation nac is only compared to two other heuristicbased baselines it would have made much more sense to also compare against other rlbased schemes as the authors point out in the related work there are various rlbased frameworks for scheduling or resource management eg firm faasrank etc i feel that the authors would have made a much stronger point is they were able to show that the nac algorithm can outperform other rlbased competitors even by focusing on a simpler objective such as minimizing latency only even if some approaches are singleagent the authors could also experiment multiagent rlbased baselines to showcase the benefits of their framework even in a basic setup to demonstrate the scalability challenges with traditional multiagent approaches the 
authors have addressed the limitations of their work docsepthe competition and scheduling of cloud service resources is a significant research problem with economic value due to the emergence of new cloud service paradigms such as serverless computing we consider the aid of multiagent reinforcement learning marl to address new management challenges to this end the authors design a meanfield game mfg approach that can efficiently manage largescale cloud users and applications they propose an online natural actorcritic algorithm for approximating functions in meanfield games and demonstrate the finitetime convergence of the algorithm theoretically the study evaluates the solutions effectiveness on openwhisk using workloads captured from real production systems showing advantages in scalability latency and resource utilization strengths the cloud resource competition and scheduling problem has real research significance and economic value comprehensive theoretical basis and proof of properties the efficiency advantage of the proposed solution makes it adaptable to real system scenarios weaknesses the background and problem formalization are insufficient resulting in the disjointed problem background and methodology validation in largescale system scenarioscases not applicable docsepthe paper presents a mean field game approach to the many selfish agent problem the paper contextualises this around the resource management problem for faas before moving on to a theoretical evaluation of the problem and finishes with experimental results the paper presents a clear introduction to the faas problem along with a detailed mathematical analysis of the mfg solution however the paper is very dense and feels like two papers rammed into one the introduction is all about the cloud and the faas problem the next section of the paper is a dense mathematical work on the mfg problem with no mention of the faas problem the faas only returns as one of the examples in the results section it would have probably been better to have couched this paper as a fully theoretical paper and just keep the faas as one of the results examples this would have given more space for covering the middle paper this could have helped in making it clear what the neurips element of this work was this was not discussed docsepthis paper aims to solve the problem of improving latency and resource utilization in serverless platforms the paper proposes a meanfield game approach where they design a natural actorcritic nac learning paradigm for mfgs with function approximation the evaluation shows the convergence of their approach and shows some latency and resource efficiency improvement on the openwhisk platform note i am not an expert on the meanfield game theory side so i am not sure about the theoretical contribution but i am confident in my evaluation of the serverless system side i appreciate the authors effort to try to formalize the resource allocation problem in serverless strengths 1 interesting approach to map resource allocation or autoscaling problem in serverless to mfgs and the authors try to solve the system challenge in a theoretica way 2 the paper applies their approach to a real system openwhisk weaknesses 1 the mapping between the idealized theoretical setup and the real system is a bit vague eg the major part considers homogeneous agents and only the openwhisk experiments consider heterogeneous agents 2 the paper does not evaluate or prove the scalability of the proposed algorithm the authors discussed some limitations 
and potential social network impacts in the paper i found some limitations after reading the paper mentioned in the above questions section
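for concreteness the nac loop described above (policy evaluation, gradient estimation, meanfield update, policy update) can be sketched in a few lines for a tabular softmax policy; the toy mdp, the exact policy-evaluation scheme, the step size, and the one-step meanfield evolution below are illustrative assumptions only and not the authors implementation, which uses function approximation and td-style critic updates

```python
import numpy as np

# toy mdp with 2 states and 2 actions; dynamics and rewards are made up for illustration
P = np.array([[[0.9, 0.1],
               [0.2, 0.8]],
              [[0.7, 0.3],
               [0.05, 0.95]]])      # P[s, a, s']
R = np.array([[1.0, 0.0],
              [0.0, 2.0]])          # R[s, a]
gamma, eta, n_iters = 0.9, 0.5, 200
theta = np.zeros((2, 2))            # softmax policy logits

def policy(theta):
    z = np.exp(theta - theta.max(axis=1, keepdims=True))
    return z / z.sum(axis=1, keepdims=True)

mu = np.full(2, 0.5)                # "meanfield" state distribution, evolves as learning proceeds
for _ in range(n_iters):
    pi = policy(theta)
    # 1) policy evaluation (done exactly here; a td-style critic would be used in practice)
    P_pi = np.einsum('sa,sap->sp', pi, P)
    r_pi = (pi * R).sum(axis=1)
    V = np.linalg.solve(np.eye(2) - gamma * P_pi, r_pi)
    Q = R + gamma * (P @ V)         # Q[s, a]
    A = Q - V[:, None]              # advantage = gradient signal
    # 2) natural policy-gradient step; for tabular softmax this is a shift by the advantage
    theta = theta + eta * A
    # 3) meanfield update: let the state distribution drift under the current policy
    mu = mu @ P_pi

print("policy:\n", policy(theta))
print("values:", V, " meanfield state:", mu)
```

the key simplification is that for tabular softmax policies a natural policy-gradient step reduces to shifting each logit by the corresponding advantage, which is what keeps the policy-update step above to a single line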
### Summary:
i agree with the reviewers that this is a wellwritten paper on an interesting application of meanfield games the paper is a nice blend of theoretical developments and experimental evaluations i believe that it will be wellreceived by the neurips community and recommend acceptance
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work shows how an existing acquisition function like maxvalue entropy search and thompson sampling can be slightly modified for quantile optimization this work additionally accounts for heteroscedastic noise different from existing works on quantile optimization this work does not assume the observation of uncontrolled variables the main issue with this work is that of clarity of presentation on the surface it is easy to read if we are not particular about knowing and understanding the technical details however if we want to the paper lacks considerable technical details for us to verify the correctness and novelty the authors have chosen to instead replace them with abstract text descriptions in this case a reader like myself can only roughly infer what the authors are trying to do instead of knowing for certain consequently the results of this work are not reproducible see detailed comments for more information there is no theoretical performance guarantee for the proposed algorithms the experimental results rely on a relatively large number of initial observations can the authors show results with fewer initial observations the only method of comparison is naive and not competitive empirically see detailed comments for more information post rebuttal feedback i like to thank the authors for their clarifications i have a further comment 7 since sparse gp is used isnt it possible to use the matrix inversion lemma to rewrite the nxn predictive covariance matrix into an expression that can be computed in linear time i have increased my score by 1 as i think there is some merit to this work however i did not increase it further due to significant changes needed to make the details of the technical approach clear for verifying the correctness after the authors clarifications this concern still remains since we can only do so by reviewing the entire final version once again prior feedback i would strongly recommend that since the experiments do not cover expectile optimization there is also a lack of discussion on its motivation the authors can exclude expectile optimization or defer it to the appendix and replace with the missing technical details necessary for understanding it is not clear to me why the authors cannot integrate an existing heteroscedastic gp model see below with an existing batch bo algorithm and modify them for quantile optimization can the authors discuss this regression with inputdependent noise a gaussian process treatment nips 1997 most likely heteroscedastic gaussian process regression icml 2007 variational heteroscedastic gaussian process regression icml 2011 a distributed variational inference framework for unifying parallel sparse gaussian process regression models icml 2016 it is not clear to me whether or how this work can be extended to the case of observation of uncontrolled variables which is common in related works on quantile optimization like cakmak et al 2020 and the missing references below distributionally robust bayesian quadrature optimization aistats 2020 optimizing conditional valueatrisk of blackbox functions neurips 2021 valueatrisk optimization with gaussian processes icml 2021 will the authors be able to find a common setting perhaps at the expense of some unused features in the tested algorithms for empirical comparison can the authors discuss the above page 2 the authors say that using a dataset that does not necessarily require replicates of observations at the same input location the dataset contains quantile values can the authors explain 
how these quantile values are obtained without replicating observations using the information from section 21 page 2 from page 1 g can denote a quantile can the authors clarify whether y1 to yn are therefore quantile values page 2 like what the authors have said while the literature on expectile regression is less extended can the authors then provide a discussion to motivate the use of equation 3 and expectile regression and why we prefer it over quantile regression page 2 the authors say that the bayesian expectile model we just introduced is new to the best of our knowledge since it is new can the authors provide further explanation and justification for the form of equation 6 page 3 the authors say that in practice we choose a gaussian prior for g and a loggaussian prior for sigma can the authors explain and justify these choices of gp and loggp for the quantileexpectile g and the variance of the asymmetric laplacegaussian distribution of epsilon respectively page 3 the authors say that the variational parameters can be optimised jointly with the model parameters eg mean function coefficients or kernel hyperparameters such that kullbackleibler divergence between the approximate and the true posterior is as small as possible this claim may not be correct if the model parameters and z are not represented in a form based on true variational parameters see for example the justification given in paragraph of the following reference stochastic variational inference for bayesian sparse gaussian process regression ijcnn 2019 page 4 the authors say that one may achieve the same result by sampling a trajectory from the posterior of g can the authors define the trajectory and explain how it is sampled page 4 the authors say that the main drawback of gpbased ts is the cost of sampling a trajectory which can only be done exactly at a finite number of input locations at a cubic cost in the number of locations can the authors explain in detail why the cost remains cubic when utilizing the sparse variational gp model in section 22 page 4 the authors say that the procedure for generating quantile samples from the variational posterior of g can be summarised as follow i find the summary too vague for understanding can the authors provide the exact technical details of this procedure provide justification for each step of this procedure and finally give the pseudocode for this procedure for example when the authors compute the mean function mx of a gpr model is this the posterior mean s is also not defined how is this procedure exactly tied to the inference procedure in section 22 how exactly does this procedure account for batch sampling page 5 the authors say that we no longer have expressions for the first term of 11 the joint differential entropy of bdimensional asymmetric laplace variables the first term is not conditioned on g can the authors explain why then are they bdimensional asymmetric laplace variables page 5 the authors say that however the qgibbon formulation makes it particularly wellsuited for a greedy approach where we first optimise qgibbon for b 1 then optimise for b 2 while fixing the first point to the previously found value etc until b points are found can the authors provide the exact technical details for this approach for example for subsequent points being sampling ie b2 does it depend on the quantile value of the first point section 44 can the authors verify that the lunar lander task indeed requires the modeling of heteroscedasticity why 10 quantile of reward and not others 
the only method of comparison is naive and not competitive empirically can the authors instead compare with existing competitive batch bo algorithms with a smaller number of replications per point in the batch considering that the batch size is relatively large in some experiments is it necessary to utilize such a large number of replications per point in these experiments minor issues figure 1 heteroscedasticy abstract and section 3 maxvalue entropy search is not the same as entropy search the authors should revise the latter to the former page 5 conjugancy page 5 leading to our propose docsepthe problem posed by the paper appears novel and natural the paper is clearly written which makes the ideas easy to follow the results show clear improvement for the benchmarks studied the authors focus on 30 and 10 quantiles in their experiments which are probably considerably easier to estimate than say 5 or 1 quantiles in many realworld problems people are interested in rare events so they would want to estimate more extreme quantiles the authors do not give any intuition about how hard that would be i wonder if this is much harder the variational inference procedure could have been stated more explicitly section 22 first paragraph it is not generally true that obtaining a mean estimate is easier than estimating quantiles eg for balanced quantiles of heavytailed distributions the median will be considerably more stable than the mean i think the opening sentence is somewhat misleading docsepthe paper is wellstructured and clearly written the performance of the proposed method is evaluated on interesting data sets and the results have insightful conclusions and demonstrate the solid performance of the proposed method i do not see any obvious weaknesses no code was submitted for reproducibility but as far as i understand from the paper it will be shared on github later docsep the idea of using directly on quantileexpectiles is very important in several applications and it is good to see work in this direction the paper is well written and easy to follow the experimental section is convincing in the experiments the authors only compare against a standard ei it would be interesting to compare a gp model with a quantile based acquisition function such as expected quantile improvement the greedy implementation of the qgibbon acquisition function seem to cause some issues when batchsize b is quite large see bottom right plot of fig3 how would a strategy such as expected quantile improvement with a standard gp see picheny et al 2013 work here why havent the authors considered it i am wondering if a more traditional modelling of heteroskedastic noise would also work here for example would a gp model as in binois et al 2018 work here could the authors comment on extending the proposed acquisition functions to such method for the greedy nature of the qgibbon optimization would it be possible to choose greedily 2 points at once by using a 2dimensional qgibbon would this reduce the issues due to the greedy optimization picheny v ginsbourger d richet y caplin g 2013 quantilebased optimization of noisy computer experiments with tunable precision technometrics 551 213 binois m gramacy r b ludkovski m 2018 practical heteroscedastic gaussian process modeling for large simulation experiments journal of computational and graphical statistics 274 808821
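for readers less familiar with the losses this discussion keeps returning to, a small sketch of the standard pinball (quantile) loss, the expectile loss, and the asymmetric laplace negative loglikelihood is given below; tau, sigma, and the synthetic data are placeholders, and no claim is made that these expressions coincide exactly with the papers equations 3 and 6

```python
import numpy as np

def pinball_loss(y, q, tau):
    """quantile (pinball) loss: tau*(y-q) if y >= q, else (1-tau)*(q-y)."""
    u = y - q
    return np.mean(np.maximum(tau * u, (tau - 1.0) * u))

def expectile_loss(y, e, tau):
    """asymmetric squared loss whose minimiser is the tau-expectile."""
    u = y - e
    w = np.where(u >= 0.0, tau, 1.0 - tau)
    return np.mean(w * u ** 2)

def asym_laplace_nll(y, q, tau, sigma):
    """negative log-likelihood of an asymmetric laplace observation model;
    its location parameter is the tau-quantile, so maximising this likelihood
    matches minimising the pinball loss up to the scale term."""
    u = y - q
    rho = np.maximum(tau * u, (tau - 1.0) * u)
    return np.mean(rho / sigma + np.log(sigma / (tau * (1.0 - tau))))

# tiny usage check with made-up data: minimising the empirical pinball loss
# at tau = 0.1 should recover roughly the 10% sample quantile
y = np.random.default_rng(0).normal(size=1000)
grid = np.linspace(-2.0, 2.0, 401)
q10 = grid[np.argmin([pinball_loss(y, g, 0.1) for g in grid])]
print("empirical 10% quantile ~", q10)
```

the check at the end uses the same 10 quantile level as the lunar lander experiment discussed above, which is why that level was chosen for the illustration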
### Summary:
meta review the paper proposes a method for bayesian quantile and expectile optimization which changes the focus in bo from a mean estimate to a quantile of the estimated objective function the goal is to provide a robust method with respect to extreme events the proposed approach handles heteroskedasticity by using two latent gps and large datasets with sparse inducing points approximations finally the authors propose two adaptations of classical acquisition functions to the quantile case the authors have replied to the reviewers rebuttal recommendation to the authors please carefully review the paper as suggested by the reviewers also in order to improve the quality of the presentation
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents a novel multigranularity crossmodal alignment mgca framework to seamlessly leverage the multigranular semantic correspondences between medical images and radiology reports for generalized medical visual representation learning extensive experimental results on different downstream datasets show that the proposed framework achieves substantial performance with limited annotated data strengths 1 a multigranularity crossmodal alignment framework is proposed for learning generalized medical visual representations from freetext radiology reports 2 the overall structure is clear 3 the experimental results show the effectiveness of the proposed model weaknesses 1 the authors incorporate several wellknown technologies including instancewise alignment finegrained tokenwise alignment and diseaselevel alignment into contrastive learning to enhance the generalizability of learned visual representations in this case which parts are new thus the overall novelty needs to be strengthened 2 in table 1 it can be seen that the proposed model performs better than other methods however because the proposed model has been pretrained on a largescale medical image and report dataset whether other methods have been trained on the large dataset 3 the generated visual and text token embeddings are projected into normalized lowerdimensional embeddings and the dimension d is set to 128 how to set the dimension the parameter study should be included see weakness docsepthe authors present a visual representation learning approach based on contrastive learning of correspondences between visual radiographs and textual radiology reports at multiple levels of granularity using three different learning objectives the first objective features standard crossmodal alignment between radiographs and their corresponding radiology report pairs by which they enforce instance level agreement the second objective features a crossattention approach that enforces the alignment between the visual token embeddings local representations obtained using the vision transformer and aggregated text token representations where the visual token is used as a query for attentionbased aggregation of word token representations and likewise the alignment between text tokens and aggregated visual tokens this enforces alignment between mutually informative local image regions in the radiograph and parts of the textual radiology report and the authors argue that this regionlevel alignment improves performance of resulting visual representations on downstream task that involve dense predicitons the final objective features alignment on a higherthaninstance level by enforcing consistent crossmodal clustering of representations of corresponding visual and text data this implicitly encourages the radiographs and reports that share highlevel semanics to have similar representations regardless of instancelevel pairing and modality authors also demonstrate that pretraining on indomain images medical is important for performance on downstream tasks and that generaldomain pretraining is not as effective strengths the paper is clearly written and well structured implementation details and reproducibility info are clearly provided with appendix containing additional ablation studies justifying parameter and design choices extensive experiments both qualitative and quantitative as well as ablation studies the choice of downstream tasks appears to be justified given the stated focus on learning visual representations and they seem diverse enough 
to put different individual aspects of their approach to the test the comparison with the main competing approach gloria seems fair as the authors also evaluated a variant of their model that uses the same image encoder as gloria as well as the same preprocessing step the idea and implementation of the diseaselevel crossmodal alignment module in this setting to the best of my knowledge can be seen as novel weaknesses lack of an ablation study involving individual learning tasks in a object detection or segmentation setting as was done under the classification setting their approach does seem to more convincingly outperform gloria in a object detection and segmentation setting compared to the classification featuring the resnet50 encoder but it would have been interesting to see an ablation study that explicitly shows how relevant for instance cta is in a dense prediction setting lack of error bars may be a potential issue since the differences in performance between their approach and baselines in some settings are sometimes small this particular setup for contrastive learning based approach for visual representation learning also in the context of paired radiographs and radiology reports as well as the idea and motivation behind tokenwise crossmodal alignment objective at its core seem to be inspired by convirt andor gloria so the work can to a degree be seen as an extension of existing work and not entirely novel the authors list the lack of consideration of retrieval tasks as one of the limitations of their work however the outlined scope of their experiments is in my opinion reasonable with regards to potentially negative societal impacts the authors mention the possible use of sensitive data in their framework this is a general concern when medical data is used and is therefore not particularly specific to their work docsepthis paper proposes a multigranularity crossmodal alignment method which learns data representations from medical scans paired with the corresponding text reports the method exploits multiple unsupervised techniques to obtain the learned representations eg contrastive losses clustering with sinkhornknopp and each of these techniques are utilized at the appropriate level of granularity the learned representations are then evaluated on a large set of imagebased downstream tasks to assess the quality of the image representations the experimental results support the merits of the proposed method however there are some weaknesses and limitations which will be listed below strengths the proposed method is novel if viewed as a whole framework that employs the granularity of the features that are present in medical scans however if one views each level in the method independently these levels are not novel per se but since the proposed method aims to encapsulate these levels together then i deem the method as novel the only level that i would deem more novel is the cpa which proposes to learn crossmodal disease prototypes the evaluation experiments are extensive the paper reports results on multiple dataset benchmarks which include multiple types of tasks classification detection and segmentation the work also evaluates two types of image encoders resnet vit the paper is clearly written and the used language is well understandable weaknesses mostly i believe the weaknesses in this work are in some missing pieces 1 missing relevant references from the literature the following are some works that perform contrastive learning on medical images taleb aiham et al contig 
selfsupervised multimodal contrastive learning for medical imaging with genetics proceedings of the ieeecvf conference on computer vision and pattern recognition 2022 chaitanya krishna et al contrastive learning of global and local features for medical image segmentation with limited annotations advances in neural information processing systems 33 2020 1254612558 han yan et al pneumonia detection on chest xray using radiomic features and contrastive learning 2021 ieee 18th international symposium on biomedical imaging isbi ieee 2021 feng ruibin et al parts2whole selfsupervised contrastive learning via reconstruction domain adaptation and representation transfer and distributed and collaborative learning springer cham 2020 8595 xu jiarui et al groupvit semantic segmentation emerges from text supervision proceedings of the ieeecvf conference on computer vision and pattern recognition 2022 there are quite a lot of works that are similar along this line would be great to cite the most relevant once from the medical imaging literature also i believe some references related to the methodological part mainly crossattention are missing such as chen yenchun et al uniter learning universal imagetext representations 2019 lu jiasen et al hierarchical questionimage coattention for visual question answering advances in neural information processing systems 29 2016 the authors list the limitations adequately
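for concreteness, the instance-level alignment described in these reviews is typically a symmetric infonce loss over paired image and report embeddings; the numpy sketch below is that generic formulation, with an arbitrary temperature, and is not the authors' exact loss or implementation

```python
import numpy as np

def info_nce(img_emb, txt_emb, temperature=0.07):
    """Symmetric InfoNCE over a batch of paired image / report embeddings:
    row i of img_emb and row i of txt_emb come from the same study and act
    as each other's positive; every other row in the batch is a negative."""
    img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
    txt = txt_emb / np.linalg.norm(txt_emb, axis=1, keepdims=True)
    logits = img @ txt.T / temperature                 # (B, B) cosine similarities
    idx = np.arange(len(img))

    def xent(l):                                       # cross-entropy with diagonal targets
        l = l - l.max(axis=1, keepdims=True)           # numerical stability
        log_p = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -log_p[idx, idx].mean()

    return 0.5 * (xent(logits) + xent(logits.T))       # image->text and text->image

# toy usage: a batch of 8 paired 128-d embeddings
rng = np.random.default_rng(0)
loss = info_nce(rng.normal(size=(8, 128)), rng.normal(size=(8, 128)))
```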
### Summary: | a multigranularity crossmodal alignment framework is proposed which learns data representations from medical scans paired with the corresponding text reports the reviewers find the approach novel and the paper wellwritten with an overall clear structure extensive experimental results show the effectiveness of the proposed model and experimental details are provided after the discussion with the authors all reviewers vote towards acceptance of the paper
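the disease-level, prototype-based alignment mentioned above relies on a balanced soft assignment of samples to prototypes, usually computed with a few sinkhorn-knopp normalisation steps; the sketch below shows that balancing step only, with epsilon and the iteration count as illustrative defaults rather than the paper's settings

```python
import numpy as np

def sinkhorn_knopp(scores, epsilon=0.05, n_iters=3):
    """Balanced soft assignment of a batch to prototypes: start from
    exp(scores / epsilon) and alternately normalise prototype rows and
    sample columns so every prototype is used roughly equally often."""
    q = np.exp((scores - scores.max()) / epsilon).T    # (n_prototypes, batch); shift for stability
    q /= q.sum()
    n_protos, batch = q.shape
    for _ in range(n_iters):
        q /= q.sum(axis=1, keepdims=True)
        q /= n_protos                                  # every prototype now holds total mass 1/K
        q /= q.sum(axis=0, keepdims=True)
        q /= batch                                     # every sample now holds total mass 1/B
    return (q * batch).T                               # (batch, n_prototypes), rows sum to 1

# toy usage: 16 samples scored against 4 disease prototypes
rng = np.random.default_rng(0)
assignments = sinkhorn_knopp(rng.normal(size=(16, 4)))
```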
253,
622,
376,
3770,
4460,
285,
253,
2929,
973,
15720,
342,
271,
4583,
2590,
2605,
9470,
5661,
1543,
921,
253,
12510,
273,
253,
4081,
1566,
285,
5661,
4278,
403,
2530,
50276,
6438,
253,
5955,
342,
253,
4477,
512,
30628,
6273,
4404,
14924,
273,
253,
2929
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a simple offline rl algorithm based on qlearning that instead of computing the exact maximum over actions takes the sample maximum when sampling from a learned estimate of the behavior policy experimental results are quite promising competitive with previous algorithms in both offline and online settings the authors demonstrate that utilizing autoregressive models significantly improves over simple vaes when used to model the behavior policy in offline learning both for the proposed method as well as prior work highlighting the need to place more focus on how we model the behavior policy in offline rl the article is clearly written and the analysis of the proposed backup is original to the best of my knowledge however with how important the choice of model used to model the behavior policy appears to be i am unsure as to how significant the new proposed backup operator is while i feel the paper clearly highlights the importance of how we model the behavior policy mu(.|s) i dont see particularly strong evidence either theoretically or empirically why one should prefer using the emaq backup regarding the simplicity of the proposed algorithm compared to previous ones emaq does use one fewer network since it does not need to explicitly model a separate policy however in order to achieve competitive performance emaq did require a more complicated model for the behavior policy and the increased sensitivity to that perhaps counteracts the benefits of not having an explicit policy additionally emaq also needs to sample multiple actions for each backup and when executing the learned policy incurring more computational costs in sampling compared to using an explicit policy network one thing to potentially examine more closely is how emaq and the regularization from choosing small n relates to previously used forms of regularization we could consider an alternate procedure where instead of sampling n samples from mu(.|s) and taking the max we instead sample from the distribution proportional to mu(a|s) exp(alpha q(s,a)) using something like selfnormalized importance sampling for example for some temperature alpha this corresponds to the policy that simultaneously maximizes the expected qvalue as well as a kl divergence penalty against the original policy mu(.|s) and would resemble the sample maximum as the temperature increases the key difference from other methods using kl regularization against the behavior policy would be that it relies on samples solely from the behavior model similarly to emaq comparing these two procedures with different parameters could perhaps give us a better understanding of how implicit regularization by choosing small n in emaq relates to explicit regularization via kl complexity measure i strongly disagree that delta(s, n) is a useful notion of complexity for offline rl while it captures a notion of how well the behavior policy covers good behaviors it uses the exact q or q^mu_n values however the difficult part in offline rl is to accurately estimate those q values and the proposed complexity term has no way to account for sources of complexity such as stochasticity or the horizon of the mdp for example an mdp with only 2 actions per state can be potentially very complex but all 2action mdps would presumably look trivial just judging by how fast delta(s, n) decreases with n misc comments with regards to the inductive bias and the choice of model class for the behavior policy what if we did actually have access to the behavior policy that generated the data would emaq perform better
with oracle access to the behavior policy or is there something about behavior model learned from the actual transitions in the dataset that would allow for better learning perhaps such a question would be useful for examining what sorts of inductive biases should go into the behavior model updates after author response complexity measure my concern with the complexity measure is that it seems to me that the biggest difficulty in reinforcement learning is in actually being able to estimate accurate value functions while proposed complexity measure really only vaguely captures how far optimal policies are from the behavior distribution in particular at each state it only depends on the true values of q for each action and cant capture how hard it is to estimate them even among similarly structured mdps we can consider a 2action mdp with a behavior policy that was simply uniform we could have one extremely simple mdp that was simply composed of independent deterministic bandit problems at each state with no transitions always remain in your starting state which would be trivial to solve even from offline data with full support on the other hand we could have a much more complex mdp with meaningful stochastic transitions random rewards and so on if we simply match the q values in the two mdps they appear to be equally complex from this measure despite the fact that the bandit mdp is far simpler to solve as such i think the proposed complexity measure doesnt really reflect the real challenges of an offline rl problem and am not sure how one would extend it to be useful re misc comments and access to true behavior policies my thinking here was that if we had a very stochastic behavior and finite samples there would be actions that have reasonably high probability under the true behavior policy but we never get to see in the data and wont be able to evaluate well one benefit of fitting the behavior model the the empirical data is that it would focus on those actions that do appear in the dataset in an extreme case we could simply have dirac deltas on the observed data points and so could benefit by restricting the actions to those that can be evaluated better overall opinion in light of the empirical results mentioned in the authors response as well as the comparisons to kl regularization i have raised my rating i still do believe it is a very borderline paper and would perhaps benefit from more careful analysis and focus on how the different behavior modelling choices influences offline docsepsummary the paper proposes the expectedmax operator to address the problem of distribution shift in offline rl which could also be applied in online rl the paper establishes theoretical analysis of the proposed operator in convergence suboptimality etc a practical algorithm is proposed based on previous techniques an ensemble of qfunctions and a different generative model using an autoregressive architecture experiments demonstrate the effectiveness of emaq strong points the paper studies the important problem of how to mitigate distribution shift in offline rl and proposes a simple but effective algorithm the paper conducts extensive experiments comparing with stateoftheart algorithms with indepth ablative studies of the effect of each component concerns i am surprised by the result in figures 1 and 2 that emaq performs very well with a small value of n5 as the upper bound in theorem 35 would be not very meaningful given a small n and gamma close 1 could authors investigate the sampled 5 actions are they 
diverse enough to represent the entire action space well considering the action space is continuous the performance of emaq strongly relates to how good mu is learned what if mu is close to a deterministic policy eg mu is not learned well and is close to a deterministic policy how to guarantee this diversity could authors compare the performance of emaq when n is 1 and an extremely large value which could better justifies the performance in extreme cases how are the computation time and learning curve besides the final performance shown in the paper of emaq compared with other methods in section 35 it is not clear how a small value of n could serve as a regularizer and smoothen the incorrect valuesdocsepthis paper analyzes offpolicy algorithms using a samplingbased maximization of the q function over a behavior policy for both bellman targets and for policy execution introducing the expected max qlearning operator emaq for analysis the paper then proposes to use more expressive autoregressive models made for learning the behavior policy from replay buffer data either online or offline which turns out to be very important especially on harder tasks in the d4rl benchmark the method is relatively novel it is close to other methods in the literature but they make and evaluate important design decisions that are valuable for the community the results are quite strong matching or exceeding bear on d4rl while at the same time matching sac for online learning the proposed method is simple yet effective they learn a behavior policy on offline data then they learn a qfunction by bellman bootstrapping using max of n samples from the behavior policy for the target step then for executing the policy the method again uses the argmax of n samples from the behavior policy as mentioned in the related work prior work such as bcq and qtopt has studied samplebased policies and it has also been common to mix the two decisions eg bear uses a parametric policy for bootstrapping but then reports a boost from sampling for policy execution but the proposed method is perhaps the simplest of these from an rl point of view which is a good thing and the juice really comes from the proposal to use made for modeling the behavior policy this also suggests clear future directions for research to explore other better generative models ar models in particular are probably a poor longterm solution because of slow sampling the analysis centers around the emaq operator which is a nice representation for thinking about the maximization problem in bellman bootstrapping the paper proves the convergence of the operator in the tabular setting and some intuitive theorems that larger values of n result in policies with better q values for policy evaluation and thus better policy improvement one issue though with n inf is that it would tend to exploit function approximation errors of the q function and this is borne out in some of the results theoretical results along these lines would significantly strengthen the paper the highlight of the paper is results on both offline and online rl in offline rl emaq roughly matches the performance of bcq and bear while being significantly simpler to implement except perhaps the made model and exceeds bcq and bear in some environments and depending on the underlying implementation in online rl emaq roughly matches sac being slightly worse when updating the policy frequently and slightly better when updating the policy less frequently as far as i know no prior method has been able to show good 
performance on both sac is poor on offline training and bcqbear is poor on online training this is the strongest positive for the paper minor comments there are a couple of references to number of samples in the intro that do not make sense until reading the method perhaps you need to outline the method in the intro instead of just hinting at it if you want to discuss these details as an example we observe that while a halfcheetah random policy obtains a return of 0 a policy that at each state uniformly samples only 5 actions and chooses the one with the best value obtains a return of 2000 im not sure if i understood the setup here or the significance is the value estimated with samples from the random behavior policy is the second policy you refer to your method so fit a behavior policy on the random behavior data then do q learning sample 5 actions for maximizing it or something else moreover i did not really understand why this is too surprising since the bulk of the work is really being done by learning a good q function and even on randomaction data learning a behavior policy would help you stay constrained to the data deferring the full related work to the appendix skirts the page limit rules and is somewhat unfair to other submitters and also made it hard to read since the links do not crossreference between pdfs importantly our theoretical analysis on the family of td operators described by emaq apply beyond bcq and can also provide new perspectives on some of the highly successful samplemax qlearning algorithms kalashnikov et al 2018a van de wiele et al 2020 particularly on how the proposal distribution affects convergence not sure if i missed this but i didnt exactly see such a result or know what this is referring to which i would expect to be something like lower dpi mu implies faster convergencedocsepthis paper proposed emaq an approximation of bellman operator and thus yields a variant of deep q learning algorithms in both online and offline settings in tabular settings several properties of the emaq operator such as the convergent guarantee fixed point and finitesample error bounds were studied then the author shows the empirical performance of a deep learning approximation of the proposed algorithm on a standard batch rl benchmark the authors also provide some analysis of the effect of different implementation choices on empirical performance by ablation study a key fact is that the proposed method can be directly got from bcq by settings the perturbation parameter to zero thus the originality here is quite little the proposed algorithm is a simplification compared with bcq however in terms of both theoretical and empirical contributions this paper does not show a significant improvement compared with bcq in general the findings of this paper seem not surprising at all given bcq and bear paper pros 1 compared with bcq the proposed emaq operator has a simpler form with similar performance in some sense it is a q learning analog of bcq which is actually an actorcritic algorithm 2 there is some ablation study on the choice of generative model for muas the number of action to sample etc which are missed in the original bcq paper cons 1 as i mentioned in the summary the biggest issue is that as a very simple and incremental algorithmic change this paper also failed to provide additional discussion in either theory or experimental that can provide significantly more insight and inspiration to people there are many potential directions where the discussion can go 1 in some sense 
it is a q learning analog of bcq which is actually an actorcritic algorithm it will be very interesting to discuss the effect of the difference between q learning and actorcritic on bcq vs emaq comparison unfortunately i did not see enough indepth discussion on that 2 the new algorithm is more closed the tabular settings so does that yield stronger theoretical guarantees in function approximation settings or is there a conceptual experiment to demonstrate where the perturbation model is a bad ideaapproximation and cause the failure of the algorithm 3 does this very simple algorithmic choice lead to very different behavior of the algorithm eg optimization path of the parameters behavior of the resulting models other than just reward why there is or is not a very different behavior of the algorithm 2 according to the paper a key contribution of the empirical part is the importance of careful generative model design for estimating behavior policies this seems very straightforward as the bcqbear algorithm is using muas given that muas is a plugin module in those algorithms new advances in the generative model can be immediately applied there so the significance and novelty of this finding are very limited 3 a general problem of this paper is overclaiming for example 1 emaq matches and outperforms the prior stateoftheart in the d4rl benchmarks it matches and did not outperform by any statistically significant margin bcq and bear there are other baselines in d4rl that are better than bcqbear brac cql mopo morel the former two are also modelfree and those numbers are all reported in the d4rl whitepaper 2 emaq provides a quite intuitive and surprising measure of the complexity for offline rl problems to claim that i think it needs to show that delta is related to some general structure of the problem theoretically or some general phenomenon for a large group of algorithms empirically these two claims can also be a potential improvement of this paper if they are really achieved
### Summary: | overview the paper provides a simplified offline rl algorithm based on bcq it analyzes the algorithms using a samplingbased maximization of the q function over a behavior policy for both bellman targets and for policy execution the emaq operator based on this the paper then proposes to use more expressive autoregressive models made for learning the behavior policy from replay buffer data the methods work well for harder tasks in the d4rl benchmark pro the method is relatively novel algorithms are simple modifications of existing ones empirical results are strong matching or exceeding bear on d4rl while at the same time matching sac for online learning works for both online and offline learning ablation study on the choice of generative model for mu(a|s) con the current form of the complexity measure is somewhat not practical theoretical results are not strong enough algorithmic contributions appear incremental recommendation the paper is on the borderline it contributes simple and nice algorithmic ideas and these ideas work well empirically these results demonstrate that a good choice of the behavior policy generative model is important for some tasks at the same time the reviewers are concerned about the theoretical parts eg issues related to the new complexity measure overall the metareviewer believes that the paper might not be in a state ready for publication
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary the paper proposes an approach to generate semantic explanations for highdimensional data the proposed approach consists of two modules the first module transforms the highdimensional raw data into lowerdimensional semantic latent space and the second module applies shapely explainability to this lowerdimensional latent space to generate explanations in terms of semantic concepts the approach has been applied to six different datasets strengths s1 the paper is very well written and easy to understand s2 the proposed approach is modular consisting of 2 modules this allows the flexibility to replace each module with more efficientadvantageous methods as research progresses in future s3 the paper proposes three different ways of transforming highdimensional raw data to lowerdimensional latent space over which shapely explainability can be applied to generate explanations in terms of semantic concepts s4 the paper shows results for six different datasets weaknesses w1 the authors should compare their approach methodologically as well as experimentally to other conceptbased explanations for highdimensional data such as kim et al 2018 ghorbani et al 2019 and goyal et al 2019 the related work claims that kim et al 2018 requires large sets of annotated data i disagree kim et al 2018 only requires a few images describing the concept you want to measure the importance of this is significantly less than the number of annotations required in the imagetoimage translation experiment in the paper where the complete dataset needs to be annotated in addition kim et al 2018 allows the flexibility to consider any given semantic concept for explanation while the proposed approach is limited either to semantic concepts captured by frequency information or to semantic concepts automatically discovered by representation learning or to concepts annotated in the complete dataset ghorbani et al 2019 also overcomes the issue of needing annotations by discovering useful concepts from the data itself what advantages does the proposed approach offer over these existing methods w2 faithfulness of the explanations with the pretrained classifier the methods of disentangled representation and imagetoimage translation require training another network to learn a lowerdimensional representation this runs the risk of encoding some biases of its own if we find some concerns with the explanations we cannot infer if the concerns are with the trained classifier or the newly trained network potentially making the explanations useless w3 in the 2module approach proposed in the paper the second module can theoretically be any explainability approach for lowdimensional data what is the reason that the authors decide to use shapely instead of other works such as breiman 2001 or ribeiro et al 2016 w4 among the three ways of transforming the highdimensional data to lowdimensional latent space what criteria should be used by a user to decide which method to use or in other words what are the advantages and disadvantages of each of these methods which might make them more or less suitable for certain tasksdatasetsapplications w5 the paper uses the phrase humaninterpretable explainability what other type of explainability could be possible if its not humaninterpretable i think the paper might benefit with more precise definitions of these terms in the paper references mentioned above which are not present in the main paper ghorbani et al 2019 amirata ghorbani james wexler james zou been kim towards automatic conceptbased 
explanations neurips 2019 goyal et al 2019 yash goyal amir feder uri shalit been kim explaining classifiers with causal concept effect cace arxiv 2019 update after rebuttal i thank the authors for their responses to all my questions however i believe that these answers need to be justified experimentally in order for the papers contributions to be significant for acceptance in particular i still have two major concerns 1 the faithfulness of the proposed approach i think that the authors answer that their method is less at risk to biases than other methods needs to be demonstrated with at least a simple experiment 2 shapely values over other methods i think the authors need to back up their argument for using shapely value explanations over other methods by comparing experimentally with other methods such as cace or even raw gradients in addition i think the paper would benefit a lot by including a significant discussion on the advantages and disadvantages of each of the three ways of transforming the highdimensional data to lowdimensional latent space which might make them more or less suitable for certain tasksdatasetsapplications because of these concerns i am keeping my original rating docsepsummary i think that what is presented is a promising method for human interpretation of highdimensional models however the experiments feel too much like toy examples without rigorous attempts to validate the resulting feature attributions or to compare to other ways of getting conceptlevel attributions update after reading the rebuttal i am maintaining my previous score i believe that given the related work reviewers have mentioned the paper most likely requires some experimental comparisons with these other methods the rebuttal arguments are good but not convincing enough that i can believe in this methods superiority without seeing a comparison with eg cace objective explain the output of highdimensional ml models in a humaninterpretable way by using shapley values on semantically meaningful latent features strengths this is an extremely important problem the shortcomings of eg pixelbased methods for images are important and wellnoted the proposed method is a promising way to attribute to and in some cases learn semantically meaningful latent features dsprites serves as a good ground truth where the generative process is known and serves to validate some of the humaninterpretable patterns found weaknesses my overall concern is that while the experimental results are interesting they are not thorough and dont demonstrate the marginal value of this method relative to other possible methods in the space for example concept bottleneck or causal concept effect models i didnt see a reference to koh et als concept bottleneck models httpsarxivorgabs200704612 which do supervised learning of semantic concepts then train a classifier on top of those concepts which is very similarrelevant work similarly i didnt see a reference to goyal et als explaining classifiers with causal concept effect cace which trains a vae to learn a meaningful latent space and report causal effects of modifications to the latent variables the existence of this prior work sharpens some specific questions i had when reading the paper for the nonfourier tasks why use a vae latent space its impossible to know that vae latent dimensions when they do look like they correspond to a concept correspond only to that concept the bar charts stating most importance goes to vertical positiondigit identity seem overconfident since that is only 
our guess of what the latent dimension means in contrast a concept bottleneck model has dimensions with clear meaning they were trained with supervision this is acknowledged to some degree in the text also for the nonfourier tasks why use the shapley framework cace is a very similar approach using a vae to get explanations in terms of an interpretable latent space but doesnt use shapley what are the pros and cons of each approach also given a meaningful latent space why not use gradientbased methods ie integrated gradients to attribute to it overall i dont think this method needs to blow the others out of the water there are legitimate counterpoints to be made for example concept bottleneck models require concept labels though i believe such labels are available in the celeba experiments but a thoughtful discussion of the relationship of this work to methods like concept bottlenecks or cace is essential and not present in the current version other weaknesses the quality of the resulting explanations is not assessed rigorously the fact that the dsprites example picks up on vertical position and the mnist example picks up on digit value is useful but many interpretability papers look more exhaustively across the dataset at what happens to the model output when the features reported as important are at least perturbed or ablated or if a model is retrained with the features altered i would particularly like to see such a comparison with concept bottleneck andor cace if this method shapley with vae is the right way to do things perhaps it will outperform the other methods at predicting how much certain concept shifts will affect model output or model retraining the fourier examples are interesting but feel a bit like toy examples to me because they are a specific case with a convenient invertible feature map and mostly recapitulate known properties of adversarial examples they rely on highfrequency patterns minor comment the landscape of semantic representations could be shortened potentially leaving room for more comparisons to other methodsbenchmarks of the attributionsdocsepthe paper describes a technique for interpreting the results of a neural net in humanreadable terms the basic idea is natural and simple 1 find a small number of coordinates for the feature space that correspond to humaninterpretable concepts 2 use shapley values to assign credit to these coordinates strong points while there is work related to this idea which the authors cite appropriately the overall technique is new the authors do a good job of backing up the theoretical idea with a set of experiments that seem to produce useful results the adversarial example scenario and the celebrity data set use case were especially easy to understand the exposition is also very clear weak points shapley values are not the only way to exploit humaninterpretable coordinates although they especially are made feasible by the reduction in number of features the paper might be even stronger if it compared other methods however i wouldnt reject the paper on this basis i recommend acceptance this is a solid paper and is likely to be wellcited additional feedback suggestions for improvement i found the math notation a little confusing there are three general types of objects a the different spaces where representations live feature space vs the interpretable space b elements of those spaces and c functions between those spaces certain symbols seemed to do double duty for instance tildex seemed to float between these meanings id recommend 
clear distinctions and contrasting notation for each type of object the authors talk about an approximate inverse tildex its not really an inverse maybe more like a pseudoinverse or maybe a left or right inverse depending on notation perhaps consider alternate terminology when tildex is differentiable one could immediately apply other interpretability techniques besides shapley values for instance take a gradient of the class score with respect to the interpretable coordinate space ideally it would be nice to see how these look so we could compare how much of the clarity of results is due to good coordinates and how much to use of shapley values this would be particularly straightforward in the first fourier transform exampledocsepsummary this paper develops a shapley value approach to explanation that uses lowdimensional latent features to explain the original input in highdimensional settings shapley values can be computationally intractable as such the authors adapt the characteristic aka value function vcdot to consider coalitions defined in latent space this helps ensure that only feasible and plausible ie semantically meaningful perturbations are made they consider three different latent encodings fourier transforms disentanglement and imagetoimage translation to isolate factors of variation strengths the paper tackles a difficult problem in practice applying shapley explanations to high dimensional data the authors do a fine job of motivating this problem and their subsequent solution the paper is quite thorough in its experimentation as it experiments with cifar10 describable textures dsprites celeba and mnist weaknesses i would have strongly preferred to see a computational analysis of how much more tractable moving to latent space makes the shapley value it may be an exponential time reduction if z is chosen to be logd for example is there a sensible algorithm or heuristic for selecting the number of latent factors when encoding perhaps there is a tradeoff between the size of z and the explanation quality questions can you comment on if the semantic space needs to be the same space that the discriminative model itself operates in if you just train an endtoend model f on xy pairs how can we be sure that the generative latent factors captured by z are actually used by f can you comment on the relationship between the shapley values directly on the input data versus the shapley values on the latent factors if any
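To make the mechanism these reviews keep returning to concrete, the sketch below shows Shapley values computed over a handful of semantic latent features rather than over raw inputs. It is illustrative only and is not the paper's implementation: the encoder, decoder and classifier are hypothetical linear stand-ins, and exact coalition enumeration is used to make the reviewers' tractability question visible (the cost grows as 2^M in the number of latent features M).

```python
import itertools
import math

import numpy as np

# Hypothetical stand-ins for the three pieces the reviews discuss: a trained
# encoder to a low-dimensional semantic latent space, its approximate decoder,
# and the pretrained classifier being explained. Linear maps are used only so
# the sketch runs end to end; they are not the paper's models.
rng = np.random.default_rng(0)
D, M = 64, 6                        # input dimension, number of latent features
W_enc = rng.normal(size=(M, D))     # "encoder" weights (illustrative)
W_dec = np.linalg.pinv(W_enc)       # "decoder": pseudo-inverse of the encoder
w_clf = rng.normal(size=D)          # "classifier" weights (illustrative)

encode = lambda x: W_enc @ x
decode = lambda z: W_dec @ z
model = lambda x: float(w_clf @ x)  # scalar class score to be explained


def shapley_on_latents(x, baseline_z=None):
    """Exact Shapley values over the M latent coordinates of input x.

    The characteristic function v(S) scores the classifier on an input
    reconstructed from a latent code whose coordinates outside the coalition
    S are reset to a baseline (zeros here), so every perturbation stays on
    the decoder's manifold of plausible inputs.
    """
    z = encode(x)
    base = np.zeros_like(z) if baseline_z is None else baseline_z

    def v(coalition):
        z_mix = base.copy()
        idx = list(coalition)
        z_mix[idx] = z[idx]
        return model(decode(z_mix))

    phi = np.zeros(M)
    for i in range(M):
        others = [j for j in range(M) if j != i]
        for r in range(len(others) + 1):
            for S in itertools.combinations(others, r):
                weight = (math.factorial(len(S))
                          * math.factorial(M - len(S) - 1)
                          / math.factorial(M))
                phi[i] += weight * (v(S + (i,)) - v(S))
    return phi


x = rng.normal(size=D)
phi = shapley_on_latents(x)
print("latent Shapley values:", np.round(phi, 3))
# Efficiency property: attributions sum to v(all features) minus v(no features).
print(np.isclose(phi.sum(), model(decode(encode(x))) - model(decode(np.zeros(M)))))
```

With M latent coordinates the exact computation visits every coalition, which is workable only when M is small (roughly M on the order of log d, as one reviewer notes); for larger M a sampling approximation in the style of KernelSHAP would be substituted, and the same v(S) could instead be differentiated to obtain gradient-based attributions in the interpretable coordinate space.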
### Summary: | this paper introduces an approach to model explainability on highdimensional data by 1 first mapping inputs to a smaller set of intelligible latent features and then 2 applying the shapley method to this set of latent features several methods are considered for 1 and empirical results are examined across several settings reviewers were mixed in their views one reviewer was in favor of acceptance and three were against some concerns were addressed by authors in the discussion period but remaining issues include concerns over the faithfulness of the approach missing comparisons to other related methods such as cace and a desire for more indepth discussion of the pros and cons of the different methods considered for 1 | [
273, 27197, 253, … (input_ids: 2,048 token ids encoding the review/summary text above, one id per line in the original dump; sequence abridged)
] | [
1, 1, 1, … (attention_mask: 2,048 entries, a run of 1s; sequence abridged)
] | [
273, 27197, 253, … (labels: 2,048 token ids, matching the start of the input_ids above; sequence abridged)
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper describes the release of a dataset and a benchmarking of offpolicy reinforcement learning methods a comment i would have about the proposed dataset is that i wish that the dataset had tried to include some more tasks that are similar to ones in which offpolicy evaluation would be used in the real world either a more complex simulated robotics task perhaps or perhaps from another domain entirely online education search recommendation systems etc in table 1 im not sure why q class would be an aspect of the environment necessarily as any environment could be attempted to be modeled using any q class i think the results section would have been more readable had there been a table of all methods evaluated with the abbreviations used to denote that method the large number of abbreviations used are difficult to crossreference in the methods section in figure 4 there should be a description of how neartop freq is calculated exactly the discussion section is clear and wellwritten and provides clear general guidelines for selection amongst different offpolicy methods  the paper presents a benchmark dataset which can be used by the community it also presents evaluations that may be useful for off policy methods in practice i think the benchmark dataset could be broadened to include tasks that are more similar to how offpolicy evaluation may be used in practical settings docsepthe work introduces a benchmark for offpolicy policy evaluation ope methods which is the problem of predicting future performance of a policy without actually deploying the policy and given only historical data the benchmark contributes eight reinforcement learning domains of varying complexity and implementations of a variety of standard evaluation methods after the response thank you for adding the example tutorial and more documentation i would encourage authors to keep working on improving ease of use by adding more endtoend standalone tutorials and not just pointing to code sections to change by moving to an automatic documentation generator from docstrings and explaining the example scripts further host some of these on a website including the benchmarked results from the paper for each dataset setting the main strength is that the benchmark standardizes different parts of the ope evaluation pipeline which is an intensive and largely avoidable work for example the interfaces among policy environment and ope models and the evaluation metrics this will greatly reduce the time needed to set up the pipeline for future work and ease benchmarking against standard methods the ips direct and hybrid methods implemented in the benchmark are seldom compared against each other and seldom evaluated in domains that cover different dimensions of complexity of the ope problem thus the benchmark is a useful contribution for finding methods that work well and identifying their limitations the main weakness is that there is little supporting documentation and tools to make it easier to use the benchmark for new ope methods the repository httpsgithubcomclvoloshincobs seems to provide three example scripts to get started which are not documented it is left to the reader to figure out the interfaces and extend them for their own models domains and evaluation metrics this increases the effort required to use the benchmark and decreases the chances of its wide adoption a well documented api and an example tutorial eg the one in bsuite httpsgithubcomdeepmindbsuite will greatly enhance this benchmarks utility to the community further the
extensibility of the benchmark to include new ope methods will be important given the rapid advances being made on the problem docsepthe authors present cobs a benchmarking suite for the comparison of offpolicy evaluation methods in a variety of experimental settings first they describe their main design choices including 6 factors which affect the performance of ope methods and 8 environments common in the related literature which they use for their evaluation then they summarize the main highlevel approaches in ope and they provide a brief overview of various already proposed methods lastly they perform an extensive evaluation and comparison of those methods using their benchmarking suite and they discuss a variety of quantitative results related to the previously discussed factors affecting the ope performance offpolicy evaluation is a very challenging problem in the rl community and it also has significant importance for researchers in closelyrelated disciplines who use rl and ope in highrisk application domains like healthcare therefore the benchmarking suite presented in this paper could be of great use to the broader community other than that a strength of the paper which is also a weakness as i discuss later on is the breadth of the evaluated methods and the results of their comparative performance the authors have put a lot of effort into categorizing and comparing 33 ope methods in order to provide several insights about their relative strengths and weaknesses i believe that the discussion of their quantitative results would be very useful for rl researchers in general as i mentioned previously the main weakness i find in this paper is its breadth of experimental results even though section 2 establishes the main characteristics of the proposed benchmarking suite in section 3 the train of thought starts to get lost since the authors try to evaluate and compare 33 ope methods which are not described in detail therefore it is exceptionally hard for someone not familiar with all of the original papers proposing those methods to fully comprehend all the quantitative results presented in the paper and get some intuition about the advantages of each method moreover the authors make qualitative conclusions about how different methods compare depending on the type of the experimental setting referring to tables and figures in the appendix more frequently than referring to the tables and figures in the main body of the paper overall the experimental section makes the paper read like a review of ope methods and it shifts the focus from the benchmarking suite itself the great quantity of experimental results provided by the authors make it clear that their benchmarking suite is indeed capable to create various different environments to test current ope methods however i would describe the experimental section as overwhelming i am not sure if it is possible but i would encourage the authors to focus on a few perhaps 23 from each one of the discussed categories ope methods in the main provide a clear description of each one of them and mainly discuss the results of those methods having an independent discussion of the rest of the results in the appendix without constantly referring to the appendix to support the claims in the main after author response i read the response and i acknowledge the fact that the authors have made a decent attempt to synthesize their results and extract highlevel insights from the comparison of the ope methods however due to the improvements required in terms of 
writing also see clarity i would like to keep my score 6 and encourage the authors to improve the readability of the paper for their cameraready version if the paper gets accepted
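For readers who want the estimator families named in these reviews pinned down, here is a rough sketch of the simplest one, importance sampling (the ips family); the trajectory format, function names and toy policies are assumptions made for illustration and are not the repository's interfaces.

```python
import numpy as np

# Assumed data format (illustrative, not the benchmark's API): a logged
# trajectory is a list of (state, action, reward, behaviour_prob) tuples, and
# pi_e(state, action) returns the evaluation policy's probability of the
# logged action.

def trajectory_weight(traj, pi_e):
    w = 1.0
    for s, a, r, p_b in traj:
        w *= pi_e(s, a) / p_b               # cumulative importance ratio
    return w


def discounted_return(traj, gamma=0.99):
    return sum(gamma ** t * r for t, (_, _, r, _) in enumerate(traj))


def importance_sampling_estimates(dataset, pi_e, gamma=0.99):
    ws = np.array([trajectory_weight(tr, pi_e) for tr in dataset])
    gs = np.array([discounted_return(tr, gamma) for tr in dataset])
    ordinary = float(np.mean(ws * gs))                # unbiased, high variance
    weighted = float(np.sum(ws * gs) / np.sum(ws))    # self-normalised, lower variance
    return ordinary, weighted


# Tiny synthetic check: uniform behaviour policy over two actions, evaluation
# policy that prefers action 1; the true value under pi_e is roughly 0.8.
rng = np.random.default_rng(0)
pi_e = lambda s, a: 0.8 if a == 1 else 0.2

logged = []
for _ in range(1000):
    a = int(rng.integers(2))                          # behaviour policy: uniform
    r = float(rng.normal(loc=1.0 if a == 1 else 0.0, scale=0.1))
    logged.append([(0, a, r, 0.5)])                   # one-step "trajectory"

print(importance_sampling_estimates(logged, pi_e))
```

The direct family would replace the importance weights with a learned value or q model, and the hybrid (doubly robust) family combines the two; these are exactly the kinds of estimators the suite is meant to standardise and compare under a common interface.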
### Summary: | all reviewers vote for accepting the paper one concern in multiple reviews was the clarity of the writing in addition one reviewer also pointed out shortcomings in the documentation which the authors have addressed by now hence i encourage the authors to carefully edit the paper for clarity overall i recommend accepting the paper since it fills a need for a common benchmark in offpolicy policy evaluation | [
30003, 310, 1677, … (input_ids: 1,364 token ids encoding the review/summary text above, one id per line in the original dump; sequence abridged)
] | [
1, 1, 1, … (attention_mask: 1,364 entries, a run of 1s; sequence abridged)
] | [
30003, 310, 1677, …
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes an improvement over the aqm approach for an informationtheoretic framework for taskoriented dialog systems specifically the paper tries to circumvent the problem of explicitly calculating the information gain while asking a question in the aqm setting while the original aqm approach sweeps over all possible guesses and answers while estimating information gain this is rendered impractical in scenarios where this space cannot be tractably enumerated as a solution aqm proposes sweeping over only some topk relevant instantiations of answers and guesses in this space by normalizing the probabilities of the subset of the space in consideration in addition unlike aqm aqm can ask questions which are relevant to the dialog context so far consequentially this is generalizable and applicable for dialog systems with non yesno answers empirical observations demonstrate improvements over the existing approaches for such taskoriented dialog systems the paper is not very wellwritten and at times is hard to understand the contributions seem incremental as well in addition to the concerns mentioned below comments the paper is overloaded with notations and the writing is not very smooth the terse nature of the content makes it hard to follow in general if someone apriori was not familiar with taskoriented dialog or the visual dialog setting in das et al 2017b it would be quite hard to follow while mentioning slrl approaches while comparing or introducing the setup the authors do not make any distinction between discriminative and generative dialog models specifically sl approaches could either be trained discriminatively to rank options among the provided ones given dialog context or in a generative manner via tokenlevel teacher forcing the authors should clearly make this distinction in the introduction and in other places where its needed the authors should stress more upon the approximations involved while calculating mutual information as far as i understand even in the aqm approach the numerator and the denominator within the logarithm are estimated from a different set of parameters and as such they need not be consistent with each other under marginalization the term resembles mi and ensuring consistency in such a framework would require either of the numerator or the denominator to be close to something like a variational approximation of the true distribution in addition aqm adopts the same framework as aqm but computes mi over some topk of the random variables being considered could the authors comment more on why restricting the space of rvs to some topk samples is a good idea would that not lead to somewhat of a biased estimator unless i am missing something training aprxagen from the training data inda seems odd assuming this to be qbots mental model of abot there is no prior reason why this should be initialized or trained in such a manner similarly the training paradigm of the depa setting is confusing if they are trained in a manner similar to a regular abot either sl or rl then theyre not approximate mental models but are rather just another abot agent in play which is being queried by qbot under comparative models in paragraph 2 of section 41 the authors state that there are some reportslooks like humans dialog can the authors elaborate on what they mean by this statement its not clear what the message to be conveyed here is comparisons in guesswhich highly rely on the pytorch implementation in the mentioned github repository however the benchmarking performed in that 
repository for rl over sl is not accurate because of inherent bugs in the implementation of reinforce see httpsgithubcombatramlplabvisdialrlissues13 and httpsgithubcombatramlplabvisdialrlpull12 i would suggest the authors to take this into account can the authors also show performances for the guesswhich models under the aqm framework on the original retrieval metrics for visual dialog mentioned in das et al 2017a this would be useful to judge the robustness of the proposed approach over the methods being compared with updated thoughts the authors adressed the issues raisedcomments made in the review in light of my comments below to the author responses i am inclined towards increasing my rating in addition i have mentioned some updates in the comments which might make the paper stronger centered around clarifications regarding the computation of the topk infogain termdocsepthe goal of this paper is to build a taskoriented dialogue generation system that can continuously generate questions and make a guess about the selected object this paper builds on the top of the previously proposed aqm algorithm and focuses on addressing the limitation of the aqm algorithm which chooses the question that maximizes mutual information of the class and the current answer but uses fixed sets of candidate questionsanswersclasses the proposed aqm the extension of aqm is to deal with 1 the natural language questions answers using rnn as the generator instead of selecting from the candidate pool rnn as generator and 2 a large set of candidate classes from 10 to 9628 the novelty is relatively limited considering that the model is revised from aqm although this work is incremental this paper addresses the important issue about the generalization the experiments show that the model achieves good performance in the experiments however some questions should be clarified 1 in the ablation study what is the performance of removing qpost and remaining qinfo asking questions using aqm and guessing with an sltrained model 2 in the experiments the baselines do not contain aqm although aqm has more constraints it is necessary to see the performance difference between aqm and aqm if the difference is not significant it means that this dataset cannot test the generalization capability of the model so experiments on other datasets may be considered if the difference is significant then the effectiveness of the model is well justified the authors should include the comparison in the experiments otherwise it is difficult to justify whether the proposed model is useful docsep updated thoughts i was primarily concerned about a lack of analysis regarding the technical contributions moving from aqm to aqm the revisions and author comments here have addressed the specific experiments ive asked for and more generally clarified the contributions made as part of aqm ive increased my rating to reflect my increased confidence in this paper overall i think this is a good paper and will be interesting to the community i also thank the authors for their substantial efforts to revise the paper and address these concerns strengths the approach is a sensible application of aqm to the guesswhich setting and results in significant improvements over existing approaches both in terms of quantitative results and qualitative examples concerns a technical novelty is limited compared to aqm the major departures from the aqm approach claimed in the paper section 33 are 1 the generation of candidate questions through beam search rather than predefined 
set 21 the approximate answerer being an rnn generating freeform language instead of a binary classifier 22 dropping the assumption that tilde pat c qt tilde p at c qt ht1 3 estimate approximate information gain using subsets of the class and answer space corresponding to the beamsearch generated question set and their corresponding answers i have some concerns about these for 1 the original aqm paper explores this exact setting for guesswhat in section 52 generating the top100 questions from a pretrained rnn question generator via beam search and ranking them based on information gain from my understand this aspect of the approach is not novel for 21 i disagree that this is a departure from the aqm approach instead simply an artifact of the experimental setting the original aqm paper was based in the guesswhat game in which the answerer could only reply with yesnona however the method itself is agnostic of this choice in fact the detailed algorithm explanation in appendix a of the aqm paper explicitly discusses the possibility of the answer generator being an rnn model generally the modifications to aqm largely seem like necessary straightforward adjustments to the problem setting of guesswhich and not algorithmic advances that said the changes make sense and do adapt the method to this more complex setting where it performs quite well b design decisions are not well justified experimentally given that the proposed changes seem rather minor it would be good to see strong analysis of their effect looking back at the claimed difference from aqm there appear to be a few ablations missing how useful is generating questions i would have liked to see a comparison to a qfix set samples from training this corresponds to difference 1 above how important is dialog history to the aprxans model this corresponds to difference 22 above how important is the choice to restrict to c classes figure 4b begins to study this question but conflates the experiment by simultaneously increasing q and a this correspond to difference 3 above c no evaluation of visual dialog metrics it would be useful to the community to see if this marked improvement in guesswhich performance also results in improved ability to predict human response to novel dialogs i and i imagine many others would like to see evaluation on the standard visual dialog test metrics if this introspective inference process improves these metrics it would significantly strengthen the paper d no discussion of inference time it would be useful to include discussion of relative inference time the aqm framework requires substantially more computation than an nonintrospective model could authors report this relative increase in inference efficiency say at k20 e lack of comparison to base aqm i would expect explicit comparison to aqm for a model named aqm or a discussion on why this is not possible minor things i dont understand the 2nd claimed contribution from the introduction at every turn aqm generates a question considering the context of the previous dialog which is desirable in practice is this claim because the aprxans module uses history review versions of papers often lack polished writing i encourage the authors to review their manuscript for future versions with an eye for clarity of terminology even if it means a departure from established notation in prior work the rlqa qualitative results are these from nondelta or delta is there a difference between the two in terms of interpretability overview the modifications made to adapt aqm to the 
guesswhich setting presented here as aqm seem to be somewhat minor technical contributions further where these difference could be explored in greater detail there is a lack of analysis that said the proposed approach does make significant qualitative and quantitative improvements in the target problem im fairly on the fence for this paper and look forward to seeing additional analysis and the opinions of other reviewers
### Summary: | important problem visually grounded dialog incremental but not in a negative sense of the word extension of prior work to an important new setting guesswhich wellexecuted paper was reviewed by three experts initially there were some concerns but after the author response and reviewer discussion all three unanimously recommend acceptance | [
10944, 273, 9172, …
] | [
1, 1, 1, … (all 1s)
] | [
10944, 273, 9172, …
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper introduces a transformerbased architecture for video situation recognition that a augments prior work on semantic role labeling in videos with spatiotemporal grounding and b removes the requirement of groundtruth roleverb mapping enabling endtoend grounded video situation recognition gvsr the proposed model videowhisperer has 3 main components a a transformer encoder to learn object embeddings per frame and event embeddings both contextually enriched via selfattention b a transformer decoder to predict the entities for each role conditioned on the object and event embeddings c a transformer decoder to autoregressively predict the caption for each role conditioned on the output features of b to evaluate the model on grounding the authors collected annotations by asking users to draw bounding boxes around roles in each event clip for the validation set these annotations will be released andor incorporated in the official evaluation toolkit these annotations are not used for training and so the proposed model is only weaklysupervised experiments are thorough both the ablations to demonstrate the importance of each design choice as well as comparison to prior work on the srl benchmark on the srl benchmark the proposed approach significantly outperforms prior work and in addition supports spatiotemporal localization unlike prior works the proposed model makes sense is wellvalidated by experiments and importantly enables joint prediction of actions roles spatiotemporal localization of roles and captions unlike prior approaches that need groundtruth roleverb annotations at test time great work the authors have included a brief overview of limitations in the appendix docsepthis paper studied the problem of video situation recognition which predicts structured information like multiple events relationships actions and verbrobe pairs the authors propose a threestage transformer model videowhisperer to handle the task the model first learns contextualized embeddings for video features in parallel with key objects that appear in the video clips the model then attends and pools information from object embeddings to localize answers to questions the final stage of the model generates answers as captions to describe each verbrole pair in the video experiments are conducted on the vidsitu dataset achieving better performance than the baseline models strengths 1 the paper overall is well written and easy to follow 2 the idea to decompose the task into different stages seems to be technically sound and the ablation study in the experimental section shows its effectiveness 3 as the conclusion of the introduction the authors also provide new data annotation to an existing dataset weaknesses 1 the first concern is about the novelty of the main idea of the paper the proposed mode seems to be complicated to decompose the task into three stages whether it is possible to compare the computation complexity between the proposed method and the baseline 2 second as a method paper this paper only conducts experiments on a recent proposed dataset and the dataset contains only a simple baseline this makes the paper less convincing whether it is possible to extend the proposed method to other video situation understanding datasets like 1 to further prove the effectiveness of the proposed method 3 it will also be great to implement more baselines on the existing benchmark 1 wu bo et al star a benchmark for situated reasoning in realworld videos thirtyfifth conference on neural information processing systems 
datasets and benchmarks track 2021 yes the authors adequately addressed the limitations and potential negative social impact of their work docsepvidsitu involves predicting the actions entities involved in the actions and simple relations between the actions given a short video clip this paper proposes spatiotemporal grounding gvsr grounded video situation recognition as an addition to the vidsitu task spatiotemporal grounding involves localizing the entities which helps disambiguation of entities and coreferencing entities across different actions the paper also proposes a novel transformer model that unlike previous works jointly predicts all task outputs verbs verbrole pairs nouns grounding the videowhisperer model has 3 stages 1 learning contextualized features for the video and key objects simultaneously predicts the action label verb roles for each event 2 semantic role labeling ie generating a caption noun for each verb role in each event 3 localization is achieved by finding the largest attention score for a bounding box in the rotd transformer for each verbrole in each event the proposed model has a few advantages 1 results are predicted jointly in a single forward pass 2 improves entity captioning accuracy and 3 localizes verbs accurately despite not having access to grounding annotations during training grounding semantic predictions in the given video is an important problem that is wellmotivated accurate grounding can improve the reliability robustness trust in the model and possibly also prediction accuracy the gvsr task is an obvious extension of gsr 13 which focused on grounding in imsitu 7 to vidsitu 6 although the originality of the task is low it is indeed useful and improves the experimental setup of the existing vidsitu task the main contribution of the paper is a localization annotations for testing on vidsitu and b the videowhisperer model unlike previous works on vidsitu the proposed model is trained endtoend and outputs the entire set of structured outputs in a single forwardpass this architecture is a significantly simpler implementation such an architecture allows learning common representations and modeling the context across subtasks of vidsitu by leveraging crossattention across object bounding box regions in the transformer model localization can be achieved by weak supervision via the caption loss there is no need for groundtruth localization annotations this allows one to potentially scale up the training as long as only language annotations are available vidsitu requires the prediction of the action label verb the entities nouns associated with the verbrole and the relationship between events the groundtruth language descriptions of the entities may be ambiguous and have some variance eg woman in blue shirt woman in glasses etc could all refer to the same person thus evaluating captions is challenging and imprecise the entity localization task measured by iou is less ambiguous further it provides an additional dimension of understanding and interpretability of the model predictions thus gvsr alleviates some of the evaluation issues in vidsitu to an extent there are significant improvements in performance in entitycaptioning accuracy localization of verbroles this work contributes to the general direction of understanding semantics in videos however current learning methods require groundtruth labels and large amounts of data a huge challenge in this area is the difficulty in scaling this up to a large number of action labels verbs and roles docsepthis 
paper introduces a new task of performing structured situation recognition over time in videos called grounded video situation recognition gvsr the idea is that for a new video a model must identify which verbs are happening identify their arguments specifying who did what to whom and then determine where in the video that event happened the authors create a dataset for this task built around vidsitu 6 also devising a set of metrics to handle multiple events happening in the same video 5 events per video each video having 10 seconds the authors collect extra annotations of the validation set to benchmark human performance the paper introduces a new model for this task called videowhisperer frames are sampled from a video and a pretrained object detector is used to extract object embeddings from the resulting frames a separate pretrained video backbone extracts features from the whole video a transformer encoder fuses these features together and a separate transformer decoder is used to perform structured prediction for the semantic role labeling over videos results and ablations show that this model performs well at both localized video situation recognition as well as the nonlocalized case vidsitu to this reviewer this paper is a strong overall contribution it tackles an important and unsolved problem finegrained video understanding and introduces a new dataset grounded vidsitu on top of vidsitu that might help evaluate this the proposed model seems like a first step in this direction pipelining the problem of structured video understanding and helpful ablations are provided to show where the model might be improved in future work weaknesses to this reviewer the model feels a bit hacky and overly complex i have a suspicion that a simpler model would perform better but im also ok with this being a problem that future work ought to address adding a limitationssocial impact section would improve the paper considerably grounded seems like an overloaded term do you mean something like localized video situation recognition the modeling section sec 3 was confusing to this reviewer it might be helpful to discuss the highlevel strategy first before jumping into the details as well as the function of the rotdvote networks those acronyms like they are referred to that often so maybe dropping them would boost understanding some qualitative examples and discussion would help there isnt a limitations section or a discussion on the societal impacts adding this would improve the paper
### Summary: | all four reviewers agreed that this paper tackles an important problem and the proposed approach is novel and well supported by sufficient empirical evidence they also acknowledged the new data annotation contribution and that that the paper is generally written well there were several nice suggestions to improve the paper the authors are strongly encouraged to incorporate them in the final version | [
input_ids: [token-ID sequence] | attention_mask: [1, 1, …, 1] | labels: [token-ID sequence] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
To help transformers learn long sequences efficiently, the paper performs attention over selective timesteps that have high expire-span scores. For each timestep, the expire-span score is computed by mapping the corresponding hidden feature to a number, which is learnt during training; soft masking is applied to make the learning differentiable. An additional loss is introduced to reduce the average span, making the attention sparse. The proposed attention is integrated into each layer of the transformer and tested on several synthetic tasks and two language-modelling datasets, yielding promising results.

Pros:
- The proposed solution (computing an expire score and minimizing the average span) is elegant and seems novel within the transformer context.
- The properties and behaviours of the method are well illustrated with detailed analysis and visualization.
- Diverse experiments are conducted.

Cons:
- Insufficient comparison with other transformer-based baselines.
- The results on real data are weak.

Detailed comments and questions:
- Sec. 4.1: the equation computing o_t should be a summation over i.
- Before transformers, sparse attention had been studied deeply in the literature; it may be beneficial to review some works (e.g. [1-3]) and try to integrate them into the transformer as additional baselines to make the experiments stronger.
- No experimental result demonstrates that the method can reduce computation complexity; please consider including a comparison of running time or physical memory usage between your method and other transformers.
- It is unclear what the baselines mentioned in the experiments are; are they vanilla transformers? How did the authors control the memory size of the baseline, as in Figs. 3, 4 and 7?
- For some synthetic tasks it is better to include stronger baselines [4, 5] to show the advantage of the proposed method over other variants of the transformer.
- In Table 2 the performance gap is significant; is it possible to improve your performance with more parameters?

[1] Nan Rosemary Ke, Anirudh Goyal Alias Parth Goyal, Olexa Bilaniuk, Jonathan Binas, Michael C. Mozer, Chris Pal, and Yoshua Bengio. Sparse attentive backtracking: temporal credit assignment through reminding. In Advances in Neural Information Processing Systems, pp. 7640-7651, 2018.
[2] Andre Martins and Ramon Astudillo. From softmax to sparsemax: a sparse model of attention and multi-label classification. In International Conference on Machine Learning, pp. 1614-1623, 2016.
[3] Vlad Niculae and Mathieu Blondel. A regularized framework for sparse and structured neural attention. In Advances in Neural Information Processing Systems, pp. 3338-3348, 2017.
[4] Goncalo M. Correia, Vlad Niculae, and Andre F.T. Martins. Adaptively sparse transformers. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pp. 2174-2184, 2019.
[5] Sainbayar Sukhbaatar, Edouard Grave, Piotr Bojanowski, and Armand Joulin. Adaptive attention span in transformers. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pp. 331-335, 2019.

Modified score: thank you, authors, for your thorough response; given the new information and baselines, I think this is a promising paper that passes the acceptance threshold. Overall quality: the authors present a method to improve the efficiency of transformer models when computing attention over previous time steps. Although this presents a neat idea that has the potential to improve an increasingly important model architecture, the experiments fall short of matching the claim that this method enables more efficient attention computation over memories in practice. Specifically, their baselines do not include relevant transformer modifications aimed at efficiency, and they provide no detailed analysis of the memory size in practice. If the authors included more thorough experiments this would be a strong paper; in their absence it is marginally below the acceptance threshold. Clarity: the abstract, introduction, background and methods sections were detailed yet easy to follow; the comparison of the time complexity of prior work in the background section was particularly helpful. However, this precision did not carry over into the experimental section, which lacked thorough experimentation (detailed under weaknesses below), and Figures 3-5 were out of order relative to the prose; the latter point is minor and does not affect my rating. Significance: the potential impact is very high, especially as applications for transformers grow; if the authors could address the weaknesses outlined below, this could be an enormously helpful augmentation to the transformer architecture. Strengths: the authors focus on an important problem for a very relevant architecture; the writing is clear and enjoyable (Section 3 in particular is a very friendly introduction to transformer time complexity); evaluations are performed over a variety of applications, spanning simple/toy to more realistic tasks. Weaknesses: the corridor, instruction, portal, copy, PG-19 and colliding-objects tasks only show comparisons against standard transformer models, as opposed to at least one or two comparable efficiency-optimized models. Giving the authors the benefit of the doubt, the first few experiments may serve more as proofs of concept where direct comparison with prior work is not as relevant or useful, but this leaves only one task in the paper (enwik8) with a comparison to prior work on improving transformer efficiency; on enwik8 the authors compare with just one modification, and the improvement seems rather small. Small margins of improvement alone are not enough to reject a paper, but given that this is the only result with a head-to-head comparison of efficiency-optimized transformers, it makes it difficult for the community to discern the contribution of the work. Furthermore, on PG-19, the copy task and object collision, the authors do not provide the memory size / average memory size / effective memory size; this makes it difficult to understand whether performance gains correspond with efficiency improvements, which is the method's stated purpose. Intuitively, an inductive bias to expire memories would make a learned model more brittle when transferring to new tasks (e.g. in the instruction task, a new form of instruction may become relevant in a test task that was never relevant in the training tasks); why is this a reasonable tradeoff to make? Questions: what value is shown in Table 2? The caption says bits per byte, but the numbers are inconsistent with Figure 7. In Figure 11, how is memory computed?

Summary: the paper proposes a method for overcoming the long-term memory bottleneck of transformers. The idea is to assign a value (the expire-span) to each formed memory, which indicates how long the memory should be stored and be available for the transformer to access. The authors demonstrate the performance of their approach on a set of synthetic and character-level language-modeling benchmarks. Significance: while the idea seems to be quite interesting and the presentation of the paper is clear and sound, I have the following concerns. As the expire-span does not seem to be updated, the model must know how long to keep the memory at the moment the memory is formed; couldn't this potentially cause issues when information arriving in the future would influence how long the memory should be kept? From the authors' descriptions, the method appears relatively brittle to hyperparameter choice; in particular, the method requires some sophisticated form of regularization for the performed benchmarks, thus raising my concerns about the stability and scalability of the approach. I would appreciate it if the authors could elaborate on my concerns. The paper also misses important related work in this domain: the paper of Gers et al., "Learning to forget: continual prediction with LSTM", already proposes a mechanism to remove memories that are not needed anymore; moreover, the proposed approach is adaptive, as for each token the network decides if it should clear some of its memory.
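The expire-span mechanism these reviews describe, a per-memory span predicted from the hidden state, a soft mask that decays once a memory outlives its span, and an auxiliary loss on the average span, can be sketched roughly as below. This is an illustrative reconstruction from the reviews' wording, not the paper's code: the linear-plus-sigmoid span predictor, the ramp width, the loss weight, and the renormalisation step are all assumed details.

```python
import torch
import torch.nn as nn

class ExpireSpanSketch(nn.Module):
    """Rough sketch of an expire-span style soft mask over past memories.

    Each past hidden state h_i gets a predicted span e_i.  A memory that is
    (t - i) steps old is kept while (t - i) < e_i and fades out linearly over
    a ramp of width R, which keeps the masking differentiable.  An auxiliary
    loss penalises the average span so that most memories expire early.
    """

    def __init__(self, d_model, max_span=1024.0, ramp=32.0, span_loss_coef=1e-6):
        super().__init__()
        self.span_predictor = nn.Sequential(nn.Linear(d_model, 1), nn.Sigmoid())
        self.max_span, self.ramp, self.span_loss_coef = max_span, ramp, span_loss_coef

    def forward(self, hidden, attn_logits):
        # hidden:      (B, T, d) past hidden states (the "memories")
        # attn_logits: (B, T, T) raw attention logits, query t attending to key i
        B, T, _ = hidden.shape
        span = self.max_span * self.span_predictor(hidden).squeeze(-1)       # (B, T)
        idx = torch.arange(T, device=hidden.device)
        age = (idx.view(T, 1) - idx.view(1, T)).clamp(min=0).to(hidden.dtype)
        # mask[b, t, i] = clip((e_i - (t - i)) / R, 0, 1): 1 while young, 0 once expired
        mask = ((span.unsqueeze(1) - age) / self.ramp).clamp(0.0, 1.0)       # (B, T, T)
        causal = torch.tril(torch.ones(T, T, dtype=torch.bool, device=hidden.device))
        attn = torch.softmax(attn_logits.masked_fill(~causal, -1e9), dim=-1) * mask
        attn = attn / attn.sum(dim=-1, keepdim=True).clamp(min=1e-9)         # renormalise
        span_loss = self.span_loss_coef * span.mean()                        # shrink spans
        return attn, span_loss

# Toy usage on random data, just to show the shapes involved.
m = ExpireSpanSketch(d_model=64)
attn, aux = m(torch.randn(2, 16, 64), torch.randn(2, 16, 16))
print(attn.shape, float(aux))    # torch.Size([2, 16, 16]) and a small positive number
```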
### Summary: | The paper studies the problem of identifying what information to forget in attention mechanisms, with the goal of enabling attention mechanisms to deal with longer contexts. This is a simple yet intuitive extension: self-attention is augmented with an expiration-value prediction. Experiments were carried out on NLP and RL tasks. Overall, the paper has novelty in the proposed idea; however, there are concerns about the strength of the experiments, namely that the experiments fall short. |
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
936,
1361,
39707,
3037,
1048,
3425,
14556,
253,
2929,
17923,
4116,
327,
13687,
4522,
383,
2265,
326,
452,
1029,
866,
74,
2076,
266,
7363,
323,
1016,
4522,
383,
554,
253,
866,
74,
2076,
266,
4868,
310,
10302,
407,
10603,
253,
3969,
8763,
4735,
281,
247,
1180,
534,
310,
34003,
1309,
3733,
2602,
44790,
310,
3732,
281,
1056,
253,
4715,
46350,
271,
3081,
2957,
310,
5611,
281,
4796,
253,
3388,
13905,
2403,
253,
4116,
23507,
253,
4081,
4116,
310,
8527,
715,
1016,
3828,
273,
39707,
285,
5762,
327,
2067,
13506,
8892,
285,
767,
3448,
26278,
15302,
27012,
12532,
1543,
50275,
856,
84,
50276,
783,
4081,
2900,
12672,
43106,
4868,
285,
28699,
253,
3388,
13905,
310,
20654,
285,
3133,
4460,
1561,
39707,
3634,
50276,
783,
3607,
285,
32536,
273,
253,
1332,
403,
973,
12800,
342,
7000,
1783,
285,
24426,
50276,
69,
7708,
4679,
403,
5196,
50276,
5040,
50276,
968,
86,
2276,
5301,
342,
643,
39707,
3169,
1666,
25379,
50275,
783,
1543,
327,
1524,
941,
403,
5075,
50276,
20119,
5701,
285,
3533,
50275,
1704,
7609,
253,
5150,
12672,
14366,
943,
320,
247,
36138,
689,
891,
50276,
9131,
39707,
23507,
4116,
556,
644,
5421,
11617,
275,
253,
6239,
352,
778,
320,
12912,
281,
2278,
690,
2987,
24088,
15567,
285,
1611,
281,
19837,
731,
715,
39707,
347,
3081,
1666,
25379,
281,
1056,
253,
3368,
10046,
50275,
2369,
5661,
906,
14371,
326,
253,
1332,
476,
4796,
13782,
10454,
4496,
1908,
1690,
247,
5301,
273,
3515,
673,
390,
3520,
3541,
441,
1131,
875,
634,
1332,
285,
643,
4979,
398,
50276,
262,
310,
12744,
752,
403,
253,
1666,
25379,
5393,
275,
253,
4679,
403,
597,
26724,
4979,
398,
849,
858,
253,
4477,
1453,
253,
3541,
1979,
273,
253,
8245,
347,
275,
3036,
20,
577,
285,
818,
50276,
1542,
690,
13506,
8892,
352,
310,
1805,
281,
2486,
10046,
1666,
25379,
5329,
281,
921,
253,
5750,
273,
253,
4081,
1332,
689,
643,
11640,
273,
39707,
50276,
249,
2829,
374,
253,
3045,
8037,
310,
1534,
310,
352,
1896,
281,
3157,
634,
3045,
342,
625,
3602,
50275,
18,
1058,
6399,
9461,
3454,
271,
343,
438,
73,
564,
90,
267,
28129,
1061,
394,
564,
90,
267,
258,
1591,
66,
10370,
6451,
2788,
480,
251,
10511,
10269,
284,
278,
44023,
260,
5497,
8260,
448,
4448,
5796,
285,
340,
6934,
5738,
270,
1205,
900,
23507,
33056,
422,
896,
40111,
11935,
6152,
12714,
949,
40935,
275,
16424,
275,
11454,
1491,
5162,
2718,
7266,
10909,
24769,
33297,
4765,
50275,
19,
16172,
968,
285,
250,
285,
17653,
251,
7846,
438,
16078,
432,
2602,
4090,
281,
23507,
4090,
247,
23507,
1566,
273,
4116,
285,
33362,
1492,
9162,
275,
5213,
8059,
327,
5145,
4715,
7266,
1668,
1047,
1036,
1508,
4022,
50275,
20,
6815,
335,
3348,
362,
17869,
285,
14168,
19683,
37559,
293,
247,
3963,
1025,
7792,
323,
23507,
285,
18872,
11454,
4116,
275,
16424,
275,
11454,
1491,
5162,
2718,
7266,
495,
23922,
1610,
2385,
4240,
50275,
21,
26087,
1179,
80,
278,
2643,
571,
362,
17869,
6815,
335,
3348,
285,
285,
250,
23899,
16172,
968,
5223,
1242,
23507,
4979,
398,
275,
10061,
273,
253,
6247,
8059,
327,
16774,
3082,
275,
3626,
3448,
5162,
285,
253,
898,
394,
5213,
6036,
8059,
327,
3626,
3448,
5162,
802,
13307,
81,
1944,
68,
13307,
81,
7266,
26517,
2945,
19105,
6247,
50274,
22,
256,
404,
32442,
274,
402,
17616,
5830,
15642,
1407,
276,
472,
16102,
268,
7173,
83,
1766,
17551,
15767,
285,
4430,
395,
42138,
3642,
17825,
4116,
50276,
2551,
275,
4979,
398,
275,
10061,
273,
253,
8988,
394,
7970,
4804,
273,
253,
5864,
323,
15180,
20365,
3397,
7266,
5922,
1012,
1671,
6247,
66,
50276,
7152,
33032,
25016,
4868,
5717,
368,
4477,
323,
634,
11080,
2380,
1677,
253,
747,
1491,
285,
1666,
25379,
891,
1158,
436,
310,
247,
12532,
2929,
326,
11999,
253,
14924,
7887,
50276,
1189,
455,
3290,
253,
4477,
1246,
247,
1332,
281,
3157,
253,
6733,
273,
39707,
3210,
672,
12672,
4116,
689,
2045,
673,
5018,
3738,
436,
10262,
247,
18176,
2934,
326,
556,
253,
2442,
281,
3157,
271,
9592,
1774,
1566,
10336,
253,
4679,
2965,
2159,
273,
11038,
253,
1750,
326,
436,
1332,
3400,
13276,
625,
5919,
4116,
13782,
689,
12959,
275,
3946,
5742,
616,
1666,
25379,
513,
417,
2486,
4623,
39707,
14586,
11205,
387,
6733,
285,
597,
2085,
642,
7000,
1783,
327,
253,
3541,
1979,
275,
3946,
604,
253,
4477,
2908,
625,
11080,
4679,
436,
651,
320,
247,
2266,
2929,
275,
616,
5928,
352,
310,
42876,
2708,
253,
14924,
7887,
50275,
498,
15752,
253,
12002,
10199,
4114,
285,
3082,
2593,
497,
7000,
2568,
3477,
281,
956,
253,
5301,
273,
673,
10454,
273,
2720,
789,
275,
253,
4114,
2593,
369,
3782,
9371,
2299,
436,
12320,
858,
417,
4459,
689,
715,
253,
5661,
2593,
534,
20296,
11080,
40290,
7000,
762,
32213,
2708,
285,
8442,
4791,
497,
562,
273,
1340,
4103,
281,
253,
36045,
253,
6158,
1127,
310,
5884,
285,
1057,
417,
2818,
619,
13716,
50276,
9188,
40348,
253,
2442,
3486,
310,
1077,
1029,
3340,
347,
4893,
323,
4979,
398,
1756,
604,
253,
4477,
812,
2953,
253,
32213,
18627,
2708,
436,
812,
320,
271,
12546,
4087,
9371,
42072,
281,
253,
39707,
10336,
50276,
296,
3755,
20556,
50276,
783,
4477,
2770,
327,
271,
1774,
1895,
323,
247,
1077,
4623,
10336,
50276,
783,
4028,
310,
2590,
285,
30357,
2593,
495,
275,
1798,
310,
247,
1077,
11453,
10199,
281,
39707,
673,
10454,
50276,
15419,
12542,
2684,
689,
247,
5235,
273,
4893,
28369,
2969,
85,
899,
281,
625,
15958,
8892,
50276,
20881,
1255,
265,
50276,
5528,
6992,
263,
9775,
20280,
3491,
23256,
746,
285,
3007,
2821,
5113,
8892,
760,
921,
14023,
323,
2629,
39707,
3210,
347,
10066,
281,
387,
1878,
581,
390,
767,
10870,
6733,
32581,
1025,
3210,
4933,
253,
4477,
253,
5649,
273,
253,
5545,
253,
806,
1643,
4679,
778,
5752,
625,
347,
27947,
273,
12342,
835,
1480,
5301,
342,
2720,
789,
310,
417,
347,
4623,
390,
4217,
533,
436,
6505,
760,
581,
4836,
275,
253,
2929,
342,
5301,
281,
2720,
789,
327,
11138,
39707,
6733,
546,
16123,
25,
327,
546,
16123,
25,
253,
4477,
7277,
342,
816,
337,
11237,
285,
253,
7756,
3133,
2581,
1355,
1355,
24390,
273,
7756,
3815,
403,
417,
2217,
281,
12009,
247,
2929,
533,
1677,
326,
436,
310,
253,
760,
906,
342,
247,
1481,
281,
1481,
5301,
273,
6733,
18325,
4979,
398,
352,
2789,
352,
2834,
323,
253,
3114,
281,
26923,
253,
7680,
273,
253,
789,
33810,
327,
23256,
746,
3491,
4836,
285,
1789,
15708,
253,
4477,
513,
417,
2085,
253,
3541,
1979,
25629,
3541,
1979,
13116,
3541,
1979,
436,
2789,
352,
2834,
281,
2096,
604,
3045,
15988,
2723,
342,
3045,
11701,
534,
310,
253,
3082,
4767,
4096,
50276,
565,
41597,
271,
42115,
8492,
281,
43106,
12959,
651,
1056,
247,
6311,
1566,
625,
1308,
1522,
672,
27090,
281,
747,
8892,
24088,
275,
253,
9775,
4836,
247,
747,
830,
273,
9775,
778,
2489,
4623,
275,
247,
1071,
4836,
326,
369,
1620,
4623,
275,
3733,
8892,
2139,
310,
436,
247,
5272,
5454,
2727,
281,
1056,
50276,
19751,
50276,
5371,
1318,
310,
2011,
275,
2829,
374,
253,
11743,
2296,
2372,
468,
8833,
533,
253,
3904,
403,
16706,
342,
4677,
818,
50276,
249,
4677,
1903,
849,
310,
3541,
10302,
7152,
339,
793,
360,
3454,
253,
2929,
29328,
247,
1332,
323,
40845,
253,
1048,
3945,
3541,
3673,
44856,
273,
4979,
398,
253,
2934,
310,
281,
9212,
247,
1318,
866,
74,
2076,
266,
281,
1016,
4447,
3541,
534,
6492,
849,
1048,
253,
3541,
943,
320,
7141,
285,
320,
2130,
323,
253,
39707,
281,
2289,
352,
253,
4477,
7568,
253,
3045,
273,
616,
2746,
327,
247,
873,
273,
13506,
285,
1894,
5251,
3448,
14053,
49602,
50275,
9188,
40348,
1223,
253,
2934,
3133,
281,
320,
3240,
4722,
285,
253,
9759,
273,
253,
2929,
310,
2590,
285,
3590,
891,
452,
253,
1563,
7350,
50275,
284,
253,
866,
74,
2076,
266,
1057,
417,
1646,
281,
320,
9300,
253,
1566,
1364,
871,
849,
1048,
281,
1978,
253,
3541,
672,
253,
3541,
310,
4447,
812,
2649,
436,
2442,
2847,
3374,
672,
1491,
20948,
275,
253,
2852,
651,
4833,
253,
13905,
273,
849,
1048,
253,
3541,
943,
320,
4934,
50275,
4064,
253,
4477,
20121,
253,
1332,
4620,
4942,
1308,
1522,
281,
4373,
19484,
4327,
275,
1798,
253,
1332,
4419,
690,
18144,
830,
273,
37820,
323,
253,
2684,
49602,
3021,
12976,
619,
7350,
670,
253,
7882,
285,
9171,
1430,
273,
253,
2746,
50275,
74,
651,
11435,
352,
604,
253,
4477,
812,
21184,
327,
619,
7350,
50275,
783,
2929,
38771,
1774,
2905,
789,
275,
436,
5028,
253,
2929,
305,
398,
1162,
355,
4715,
281,
7740,
45120,
10554,
342,
298,
296,
78,
2168,
29328,
247,
5122,
281,
5386,
12959,
326,
403,
417,
3058,
10542,
25761,
253,
4081,
2746,
310,
17825,
347,
323,
1016,
10669,
253,
2990,
21936,
604,
352,
943,
2590,
690,
273,
697,
3541,
50275,
187,
187,
4118,
18435,
27,
783,
2929,
2175,
253,
1895,
273,
12488,
752,
1491,
281,
7740,
275,
4116,
6297,
342,
253,
4736,
273,
17690,
4116,
6297,
281,
2968,
342,
3356,
22349,
436,
310,
247,
2969,
2568,
27350,
6880,
50276,
1286,
42959,
310,
31612,
342,
271,
32298,
1318,
50276,
12787,
2474,
4679,
497,
4824,
562,
327,
295,
24343,
285,
391,
77,
8892,
4583,
253,
2929,
556,
38135,
275,
253,
4081,
2934,
2299,
627,
403,
7350,
670,
253,
4757,
273,
253,
4679,
326,
253,
4679,
2965,
2159
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
936,
1361,
39707,
3037,
1048,
3425,
14556,
253,
2929,
17923,
4116,
327,
13687,
4522,
383,
2265,
326,
452,
1029,
866,
74,
2076,
266,
7363,
323,
1016,
4522,
383,
554,
253,
866,
74,
2076,
266,
4868,
310,
10302,
407,
10603,
253,
3969,
8763,
4735,
281,
247,
1180,
534,
310,
34003,
1309,
3733,
2602,
44790,
310,
3732,
281,
1056,
253,
4715,
46350,
271,
3081,
2957,
310,
5611,
281,
4796,
253,
3388,
13905,
2403,
253,
4116,
23507,
253,
4081,
4116,
310,
8527,
715,
1016,
3828,
273,
39707,
285,
5762,
327,
2067,
13506,
8892,
285,
767,
3448,
26278,
15302,
27012,
12532,
1543,
50275,
856,
84,
50276,
783,
4081,
2900,
12672,
43106,
4868,
285,
28699,
253,
3388,
13905,
310,
20654,
285,
3133,
4460,
1561,
39707,
3634,
50276,
783,
3607,
285,
32536,
273,
253,
1332,
403,
973,
12800,
342,
7000,
1783,
285,
24426,
50276,
69,
7708,
4679,
403,
5196,
50276,
5040,
50276,
968,
86,
2276,
5301,
342,
643,
39707,
3169,
1666,
25379,
50275,
783,
1543,
327,
1524,
941,
403,
5075,
50276,
20119,
5701,
285,
3533,
50275,
1704,
7609,
253,
5150,
12672,
14366,
943,
320,
247,
36138,
689,
891,
50276,
9131,
39707,
23507,
4116,
556,
644,
5421,
11617,
275,
253,
6239,
352,
778,
320,
12912,
281,
2278,
690,
2987,
24088,
15567,
285,
1611,
281,
19837,
731,
715,
39707,
347,
3081,
1666,
25379,
281,
1056,
253,
3368,
10046,
50275,
2369,
5661,
906,
14371,
326,
253,
1332,
476,
4796,
13782,
10454,
4496,
1908,
1690,
247,
5301,
273,
3515,
673,
390,
3520,
3541,
441,
1131,
875,
634,
1332,
285,
643,
4979,
398,
50276,
262,
310,
12744,
752,
403,
253,
1666,
25379,
5393,
275,
253,
4679,
403,
597,
26724,
4979,
398,
849,
858,
253,
4477,
1453,
253,
3541,
1979,
273,
253,
8245,
347,
275,
3036,
20,
577,
285,
818,
50276,
1542,
690,
13506,
8892,
352,
310,
1805,
281,
2486,
10046,
1666,
25379,
5329,
281,
921,
253,
5750,
273,
253,
4081,
1332,
689,
643,
11640,
273,
39707,
50276,
249,
2829,
374,
253,
3045,
8037,
310,
1534,
310,
352,
1896,
281,
3157,
634,
3045,
342,
625,
3602,
50275,
18,
1058,
6399,
9461,
3454,
271,
343,
438,
73,
564,
90,
267,
28129,
1061,
394,
564,
90,
267,
258,
1591,
66,
10370,
6451,
2788,
480,
251,
10511,
10269,
284,
278,
44023,
260,
5497,
8260,
448,
4448,
5796,
285,
340,
6934,
5738,
270,
1205,
900,
23507,
33056,
422,
896,
40111,
11935,
6152,
12714,
949,
40935,
275,
16424,
275,
11454,
1491,
5162,
2718,
7266,
10909,
24769,
33297,
4765,
50275,
19,
16172,
968,
285,
250,
285,
17653,
251,
7846,
438,
16078,
432,
2602,
4090,
281,
23507,
4090,
247,
23507,
1566,
273,
4116,
285,
33362,
1492,
9162,
275,
5213,
8059,
327,
5145,
4715,
7266,
1668,
1047,
1036,
1508,
4022,
50275,
20,
6815,
335,
3348,
362,
17869,
285,
14168,
19683,
37559,
293,
247,
3963,
1025,
7792,
323,
23507,
285,
18872,
11454,
4116,
275,
16424,
275,
11454,
1491,
5162,
2718,
7266,
495,
23922,
1610,
2385,
4240,
50275,
21,
26087,
1179,
80,
278,
2643,
571,
362,
17869,
6815,
335,
3348,
285,
285,
250,
23899,
16172,
968,
5223,
1242,
23507,
4979,
398,
275,
10061,
273,
253,
6247,
8059,
327,
16774,
3082,
275,
3626,
3448,
5162,
285,
253,
898,
394,
5213,
6036,
8059,
327,
3626,
3448,
5162,
802,
13307,
81,
1944,
68,
13307,
81,
7266,
26517,
2945,
19105,
6247,
50274,
22,
256,
404,
32442,
274,
402,
17616,
5830,
15642,
1407,
276,
472,
16102,
268,
7173,
83,
1766,
17551,
15767,
285,
4430,
395,
42138,
3642,
17825,
4116,
50276,
2551,
275,
4979,
398,
275,
10061,
273,
253,
8988,
394,
7970,
4804,
273,
253,
5864,
323,
15180,
20365,
3397,
7266,
5922,
1012,
1671,
6247,
66,
50276,
7152,
33032,
25016,
4868,
5717,
368,
4477,
323,
634,
11080,
2380,
1677,
253,
747,
1491,
285,
1666,
25379,
891,
1158,
436,
310,
247,
12532,
2929,
326,
11999,
253,
14924,
7887,
50276,
1189,
455,
3290,
253,
4477,
1246,
247,
1332,
281,
3157,
253,
6733,
273,
39707,
3210,
672,
12672,
4116,
689,
2045,
673,
5018,
3738,
436,
10262,
247,
18176,
2934,
326,
556,
253,
2442,
281,
3157,
271,
9592,
1774,
1566,
10336,
253,
4679,
2965,
2159,
273,
11038,
253,
1750,
326,
436,
1332,
3400,
13276,
625,
5919,
4116,
13782,
689,
12959,
275,
3946,
5742,
616,
1666,
25379,
513,
417,
2486,
4623,
39707,
14586,
11205,
387,
6733,
285,
597,
2085,
642,
7000,
1783,
327,
253,
3541,
1979,
275,
3946,
604,
253,
4477,
2908,
625,
11080,
4679,
436,
651,
320,
247,
2266,
2929,
275,
616,
5928,
352,
310,
42876,
2708,
253,
14924,
7887,
50275,
498,
15752,
253,
12002,
10199,
4114,
285,
3082,
2593,
497,
7000,
2568,
3477,
281,
956,
253,
5301,
273,
673,
10454,
273,
2720,
789,
275,
253,
4114,
2593,
369,
3782,
9371,
2299,
436,
12320,
858,
417,
4459,
689,
715,
253,
5661,
2593,
534,
20296,
11080,
40290,
7000,
762,
32213,
2708,
285,
8442,
4791,
497,
562,
273,
1340,
4103,
281,
253,
36045,
253,
6158,
1127,
310,
5884,
285,
1057,
417,
2818,
619,
13716,
50276,
9188,
40348,
253,
2442,
3486,
310,
1077,
1029,
3340,
… token-id sequence omitted …
] |
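For readers who want to inspect rows in this Input / Output / input_ids / attention_mask / labels layout, a minimal loading sketch follows. It assumes the rows are stored as JSON lines and uses the Hugging Face `datasets` library; the file name `review_summaries.jsonl` is hypothetical, not part of the original dataset card.

```python
# Minimal sketch, assuming the rows are stored as JSON lines with exactly the
# columns shown above; the file name "review_summaries.jsonl" is hypothetical.
from datasets import load_dataset

ds = load_dataset("json", data_files="review_summaries.jsonl", split="train")
row = ds[0]

print(row["Input"][:200])          # prompt + "### Review:" + review text
print(row["Output"][:200])         # the target summary
print(len(row["input_ids"]))       # tokenised prompt + review
print(set(row["attention_mask"]))  # usually {1} when no padding is stored
print(len(row["labels"]))          # token ids used as the training target
```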
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposed dasgd an algorithm for largescale largebatch training of deep neural networks the algorithm combines local sgd with delayed averaging steps to hide the communication overhead however workers still synchronize their forwardbackward passes in each iteration a convergence rate of o1sqrtk to a stationary is provided for smooth nonconvex objectives with respect to the average parameter set across workers numerical experiments provided on cifar10 and some brief results on imagenet1k with a resnet18 and a batchsize of 256 strengths this paper considers a very interesting and relevant problem reducing communication overhead of distributed deep learning methods additionally the approach of using delayed averaging to hide communication overhead is well motivated weaknesses 1 misses a large body of literature on the problem including the whole set of gossipbased methods that have been proposed for this problem the whole set of decentralized asynchronous methods that have been proposed for this problem codistillation based methods hiding communication overhead by synchronizing predictions via knowledgedistillation 2 additionally the novelty is highly limited for example local steps with delayed averaging has already been used in asynchronous decentralized methods for instance see tauoverlap stochastic gradient push a which combines tau local steps with asynchronous delayed averaging over arbitrary directed communication graphs not necessarily alltoall 3 the proposed method still requires synchronizing workers during forwardbackward passes which does not tolerating variance in interprocessor update times which even in the case of homogeneous clusters still presents nontrivial slowdowns in the context of machinelearning problems 4 numerical results on imagenet are not convincing since the benchmarks are nonstandard for this literature specifically only using a rn18 in in1k for 20 epochs with a batchsize of 256 does not provide a good indication on the practical utility of the proposed method relative to related work not to mention that a numerical comparison is missed with a large body of related work 5 minor point but the heuristic approximation used to determine the lower bound on the delay hyperparameter d does not necessarily scale linearly with the number of workers for instance allreduce collective communication implementations using distancehalvingvectordoubling or ringbased allreduce only scales logarithmically with the number of workers not linearly a assran et al stochastic gradient push for distributed deep learning icml 2019 b vogels et al relaysum for decentralized deep learning on heterogeneous data neurips 2021 c koloskova et al decentralized deep learning with arbitrary communication compression iclr 2020 d lian et al can decentralized algorithms outperform centralized algorithms a case study for decentralized parallel stochastic gradient descent neurips 2017 e lian et al asynchronous decentralized parallel stochastic gradient descent icml 2018 f tang et al decentralized training over decentralized data icml 2018 g assran et al advances in asynchronous parallel and distributed optimization proceedings of the ieee 2020 h assran and rabbat asynchronous gradient push ieee transactions on automatic control 2021 i assran et al gossipbased actorlearner architectures for deep reinforcement learning neurips 2020 j spiridinoff et al robust asynchronous stochastic gradientpush asymptotically optimal and networkindependent performance for strongly convex functions jmlr2020 k pu 
et al asymptotic network independence in distributed stochastic optimization for machine learning examining distributed and centralized stochastic gradient descent ieee signal processing magazine 2020 l sodhani et al a closer look at codistillation for distributed training arxiv 2020 m anil et al large scale distributed neural network training through online distillation iclr 2018 the for sufficiently large k is standard but is quite vague the lower bound on k is quantified in related works it is my opinion that although the problem is quite interesting and relevant a large body of literature is missed and the proposed method is not sufficiently novel moreover the theoretical contributions do not provide improvements over the related literature and could use further specification in certain parts eg although for sufficiently large k is a common qualifier a reasonable lower bound is usually made specific in the the theorem statement additionally i do not find the numerical experiments to provide convincing evidence that the proposed method is indeed better suited for largebatch distributed deep learning than existing approaches docsepin this paper authors propose the modification of distributed sgd algorithm that combines the ideas of local sgd and delayed averaging of updates in this paper authors consider the distributed sgd with delayed gradients and bounded delays this assumption on delays allows algorithm to converge with the same rate as local sgd algorithm first of all i didnt get the problem we try to solve unfortunately the algorithm itself is not mentioned in the paper only a single line that corresponds to the update rule from this rule i conclude that all nodes are interconnected since the second term is dependent on the all neighbors gradients and points this questions me if this algorithm is good in practice yes the local update happens much more often than the global update but what is the reason then to use this step instead of simple averaging i think that this article is poorly written first of all i want to mention that authors decided to decrease the font size of all important equations and plots to fit the conference size requirements this makes this article extremely hard to read after printing second for me as a colorblind person all the plots are extremely unclear since the difference between the lines is minimal third i missed the main problem formulation moreover the algorithm itself is not mentioned only update without any details about hyperparameters selection second thing to mention is a novelty and a practical interest of the algorithm i understand what is a difference between this algorithm and the local sgd but i am hesitating in the following since during the full update not local we communicate completely m2 exchanges what is a reason in such a delayed gradient computation usually the communication is much more expensive than the gradient computation so the reason on delaying the gradients is not clear for me in work httpsarxivorgpdf180609429pdf authors use delayed gradients almost the same way as in this paper however it allows to save the total amount of communications about the experimental part i have no comments since i failed to recognize the lines difference all in all i find this article in a very draft stage to be published docsepin this paper the authors present dasgd which overlaps the computation and communication of distributed training in a nutshell the idea is to combine pipesgd and local sgd which totally makes sense the experiment results 
show good performance in this paper the authors present dasgd which overlaps the computation and communication of distributed training the paper is wellwritten and easy to understand in a nutshell the idea is to combine pipesgd and local sgd which totally makes sense the experiment results show good performance however i have the following concerns in the novelty 1 the idea of fully parallelize sgd and forwardbackward propagations to hide 100 of gradient communication is actually not new pipesgd 1 proposed using sgd with 1 step of staleness to overlap communication and computation however 1 is not cited or discussed in this paper 2 the combination of pipesgd and local sgd is also not new 23 both uses such a combination basically i think 3 proposes the combination of pipesgd and local sgd and 2 adds communication compression to 3 please correct me if im wrong and indicate the major difference between the ideas of this paper and 3 references 1 li youjie et al pipesgd a decentralized pipelined sgd framework for distributed deep net training neurips 2018 2 delaytolerant local sgd for efficient distributed training httpsopenreviewnetforumideu8zvx7zrf 3 wang jianyu et al overlap localsgd an algorithmic approach to hide communication delays in distributed sgd icassp 2020 in this paper the authors present dasgd which overlaps the computation and communication of distributed training the paper is wellwritten and easy to understand however i have some concerns in the novelty docsepthe paper proposes to combine local sgd with overlapped communication a convergence analysis is given the experiments validates the model performance 1 the critical issue is that neither local sgd nor overlapped 1 communication is novel whats more combining them is also not novel as a matter of fact the proposed method figure 2 c is nearly the same to fig 2 in 2 2 i did not find the improvement regarding the wall clock time by hiding the communication in the experiment references 1 li y yu m li s avestimehr s kim ns and schwing a pipesgd a decentralized pipelined sgd framework for distributed deep net training neurips 2018 2 wang jianyu hao liang and gauri joshi overlap localsgd an algorithmic approach to hide communication delays in distributed sgd in icassp 20202020 ieee international conference on acoustics speech and signal processing icassp pp 88718875 ieee 2020 i believe the proposed method is not novel and 2 has proposed a very similar method
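The reviews above all describe the same underlying recipe: local SGD steps combined with a delayed, overlapped averaging step so that communication can hide behind computation. A hedged sketch of that generic recipe is shown below; it is not the paper's DASGD implementation, and the `model.*` methods and the `comm` object are hypothetical stand-ins for a real training loop and a collective-communication backend such as torch.distributed.

```python
# Hedged sketch of "local SGD + delayed averaging" (not the paper's code).
# model.* methods and the comm object are hypothetical stand-ins for a real
# training loop and a collective backend (e.g. torch.distributed).
def delayed_averaging_sgd(model, data_loader, lr, sync_every, delay_d, comm):
    pending = []  # queue of (async_handle, step_launched)
    for step, (x, y) in enumerate(data_loader):
        grads = model.backward(model.loss(x, y))
        model.sgd_step(grads, lr)                     # local update every step

        if step % sync_every == 0:
            # launch a non-blocking average of the current parameters
            handle = comm.allreduce_async(model.parameters_vector(), op="avg")
            pending.append((handle, step))

        # only consume an average delay_d steps after it was launched, so the
        # communication overlaps with the intervening local computation
        while pending and step - pending[0][1] >= delay_d:
            handle, _ = pending.pop(0)
            model.load_parameters(comm.wait(handle))  # adopt the delayed average
```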
### Summary: | this paper proposes a variant of stochastic gradient descent that parallelizes the algorithm for distributed training via delayed gradient averaging while the algorithm dasgd proposed is sensible and seems to work it also seems to miss a lot of related work as pointed out by one of the reviewers the class of asynchronous decentralized methods already seem to cover the space of dasgd and its not clear how dasgd differs from the existing methods in this space as a result of this lack of comparison to related work the reviewers recommended that the paper not be accepted at this time and this evaluation was not challenged by an author response i agree with this consensus | [
… input_ids token-id sequence omitted …
] | [
… attention_mask of all 1s omitted …
] | [
… labels token-id sequence omitted …
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper extends the recently proposed dense retrieval methods to the multihop opendomain questions so as to handle complex multihop queries the overall idea is simple direct but effective the authors conduct extensive experiments on two multihop datasets hotpotqa and multievidence fever and evaluation results demonstrate that the proposed model achieves impressive results on both the knowledge retrieval task and multihop qa my only concern is about the novelty of the paper the contribution of this paper seems to be limited as it just combines the recently proposed dense retrieval methods with multihop qa besides compared with 1 this paper seems to just replace the rnnbased encoder in the knowledge retriever with the bertbased encoder 1 das r dhuliawala s zaheer m et al multistep retrieverreader interaction for scalable opendomain question answeringciclr 2019 docsepsummary the paper proposes a simple clever and as far as i can tell novel combination of dense retrieval techniques and pseudo relevance feedback for multihop complex opendomain qa the basic idea is to concatenate the passages returned for the first query to the original question to form a new query to be encoded and used in combination with the retrieval system thus the paper does not present a radically new idea but combines successfully recent popular methods dense retrieval featuring in most sota work in qa with old techniques well known and studied for decades in the ir community in general i enjoyed reading this paper and find this is a useful technique that the community should learn about to explore further in the same and other related problems eg openqa the approach is somewhat incremental but competitive and conceptually simple the lack of awareness for the ir context is my main reason for the score not being higher pros the paper shows that the idea is effective in terms of performance yielding stateoftheart results on two multihop datasets hotpotqa and multievidence fever experiments seem generally rigorous and reproducible following standard popular datasets and procedures experiments include very recent work on readers architectures including generative ones rag fid the architectures simplicity does not make assumptions on the nature of the data and associated meta information eg link graphs and produces also a more efficient system cons the paper could be organized better in its final version in particular providing more context and motivation for the problem first of all why are such complex questions important how key is the multihopness aspect i would particularly recommend anchoring this discussion to the findings of min et al 2019 httpswwwaclweborganthologyp191416 in particular the fact that many such questions can be solved in one hop how naturalartificial is the task how does this aspects affects this specific study the related work section is insufficient and the absence of an adequate discussion of the pseudo relevance feedback work in ir is a major weakness this is an foundational line of work going back at least to the research of rocchio in the 1970s i would suggest ruthven and lalmas 2003 a survey on the use of relevance feedback for information access systems as a starting point detailed feedbackcomments how does the use of tfidf as a source of hard negatives relate to the argument about ir baselines being poor what is the motivation for and conclusion for evaluating additional linked docs as negatives if they only yield minor gains could you discuss more the nature of the supervised information 
available with respect to the fact that the number of hops is known and also the order of the passage sequences or can be inferred heuristically eg wrt to the claim training in an orderagnostic manner hardly works at all and underperforms even the singlehop baseline have you tried using more than 2 steps it may be valuable to run an experiment it would be also valuable to experiment with this approach on other openqa tasks such as natural question etc also to provide more evidence for the generality of the approach how does this approach deal with the limited encoder capacity in terms of number of tokens how many passages can you append to form the 2nd query and how does this affect performance docsepsummary this paper proposes multihop dense retrieval for opendomain multihop question answering it extends previous dense passage retrieval into the corresponding multihop version by using retrieved passages to latently reformulate the query representation after each retrieval pass in the end it can significantly improve the performance on hotpotqa and multievidence fever dataset the analyses are very comprehensive and extensive from almost every relevant perspective pros 1 extending dense passage retriever into its multihop version is a reasonable direction this is the first work in this direction 2 the experimental results are strong and the analyses are comprehensive cons 1 this paper mainly focuses on the experimental and analysis part although it is good to know these lessons in multihop dense retrieval the proposed method itself is limited in terms of novelty 2 since the idea mainly comes from dense passage retrieval it is not very clear to me whether the improvement comes from the dense passage retrieval or the proposed latent query reformulation for dpr since these are the major contribution of this paper i think it is necessary to have a separate section for discussion question 1 in table 3 what is the training detail of the singlehop ablation is it also trained through a similar negative sampling process described in section 22 if so why the improvement of multihop version dpr is so significant 2 in section 1 it is said that the main problem in answering multihop opendomain questions is that the search space grows exponentially with each retrieval hop in my understanding the proposed multihop dpr still suffers from this problem right the only difference between the proposed multihop dpr and previous approaches is that it does not use any structured knowledge within the documents for retrieval docsep1 summary of this paper the topic of this paper is multihop qa which studies answering complex natural language questions complex questions require an information aggregation process across multiple documents and recent multihop qa models design this process by sequentially retrieving relevant documents asai 2020 et al this paper alleviates two problems in recent multihop qa models one is that recent multihop qa models require external knowledge such as wikipedia hyperlinks this problem results in the models low generalization ability on new domains that the external knowledge is no longer available the other problem is computational efficiency the authors propose a novel multihop qa model named mdr that does not require external knowledge and is ten times faster than the recent models mdr uses question reformulation and mips question reformulation design the information aggregation process by iteratively generating a query vector related to the documents that should be accompanied to 
answer the original question mdr generates such query vectors by comparing the given question and previously retrieved documents mdr encodes passages in a large corpusindexing with the same encoder used in the question reformulation process and uses mips to find relevant documents with the generated query vectors in experiments the authors show that mdr outperforms recent multihop qa models and also they show the computational efficiency of mdr 2 strong and weak points of this paper strong points this paper provides a detailed analysis of their method experimental results show the validity of the proposed method and some strong findings described below table 2 confirms that mdr outperforms graph rec retriever asai et al this result shows the feasibility of a more accurate multihop qa model without external knowledge such as wikipedia hyperlinks table 3 shows a detailed analysis of each component in mdr this table indicates several vital features for multihop qa models that can be easily ignored in the model design process the experimental results on wo order and wo linked negatives show significant findings in multihop qa table 4 shows that the question reformulation method mdr has similar performance to the question decomposition method with humanannotated subquestions table 5 shows the endtoend performance of multihop qa models mdr outperforms existing stateoftheart multihop qa models the proposed method is computationally efficient the proposed method is simple many followup studies based on the proposed method are expected the experimental results support their claim weak points this paper does not mention the publicly available code of their method it would be nice if the authors provide implementations after the decision process in the section question decomposition for retrieval the authors conclude that question decomposition is unnecessary in the context of dense retrieval with a strong pretrained encoder however table 4 shows that question decomposition with a simple opendomain qa model has a similar performance to mdr these results indicate that question decomposition is an effective method to make simple singlehop opendomain qa models used in multihop qa please provide more evidence for the conclusion unnecessity of question decomposition 3 recommendation accept this paper provides several significant findings that are expected to be referred to by many other studies their method is simple and outperforms other multihop qa models also it is computationally efficient 4 questions in table 4 the decomp method is based on dpr dense passage retriever what will be the results if mdr uses the gold subquestions does using subquestions in mdr increase retriever performance please provide the number of hard negative samples for a question in section 22 what is the start token in the sentence specifically we apply layer normalization over the start tokens representations from roberta to get the final dense querypassage vectors is the start token pooledoutput of the cls token or hidden representation of the cls token
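The reviews describe the retrieval loop in enough detail to sketch it: encode the question, retrieve with maximum inner-product search (MIPS), append the retrieved passage to form the next-hop query, re-encode, and retrieve again while keeping a small beam of passage chains. The sketch below is illustrative only; `encode_query`, `passage_index.search`, and the `[SEP]` joining string are stand-ins for a shared dense encoder and a FAISS-style index, not the authors' code.

```python
# Hedged sketch of iterative (multi-hop) dense retrieval with query
# reformulation; encode_query and passage_index are stand-ins for a shared
# dense encoder and a FAISS-style max-inner-product index.
import numpy as np

def multihop_retrieve(question, encode_query, passage_index, passages,
                      beam_size=5, hops=2):
    # each candidate is (current query text, chain of passage ids, score)
    beam = [(question, [], 0.0)]
    for _ in range(hops):
        expanded = []
        for query, chain, score in beam:
            q_vec = np.asarray([encode_query(query)], dtype="float32")
            scores, ids = passage_index.search(q_vec, beam_size)   # MIPS top-k
            for s, pid in zip(scores[0], ids[0]):
                # reformulation: append the retrieved passage to the question
                next_query = question + " [SEP] " + passages[pid]
                expanded.append((next_query, chain + [int(pid)], score + float(s)))
        # keep only the highest-scoring passage chains (beam search)
        beam = sorted(expanded, key=lambda c: c[2], reverse=True)[:beam_size]
    return [(chain, score) for _, chain, score in beam]
```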
### Summary: | the paper introduces improving passage retrieval for multihop qa datasets by recursively retrieving passages adding previously retrieved passages to the input in addition to a query this simple method shows gains on multiple qa benchmark datasets and the evaluation presented in the paper on multiple competitive benchmark datasets hotpotqa fever is very thorough r1 r3 r4 while the application is pretty narrow the performance gain considering both efficiency and accuracy is fairly significant and the paper presents a simple model with less assumption eg interdocument hyperlinks that could be useful for future research 1 also seems like a relevant line of work 1 generationaugmented retrieval for opendomain question answering httpsarxivorgpdf200908553pdf | [
… input_ids token-id sequence omitted …
24392,
281,
253,
3280,
275,
1635,
281,
247,
7316,
436,
2969,
1332,
2722,
15988,
327,
2709,
2805,
66,
22791,
15302,
285,
253,
7103,
3559,
275,
253,
2929,
327,
2709,
12085,
22791,
15302,
3511,
11714,
31569,
15198,
310,
1077,
11080,
391,
18,
391,
20,
391,
21,
50275,
6050,
253,
2898,
310,
3965,
6891,
253,
3045,
6351,
7296,
1097,
6733,
285,
7200,
310,
9648,
1534,
285,
253,
2929,
10262,
247,
2969,
1566,
342,
1679,
9376,
24088,
734,
3306,
4373,
23053,
326,
812,
320,
4217,
323,
2852,
2561,
50275,
18,
671,
3133,
751,
247,
4623,
1386,
273,
789,
50275,
18,
5978,
2321,
16390,
25064,
323,
1121,
423,
297,
404,
1953,
22291,
5987,
39962,
2061,
9275,
1518,
2270,
2227,
3357,
9275,
50276
] | [
(attention_mask: the value 1 repeated once per token; values omitted)
] | [
(labels: numeric token IDs mirroring the row text; values omitted)
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in the context of neural networks with mixture of experts moe layers the paper proposes to decrease the number of activated experts per each input as training progresses by means of decreasing the temperature in the softmax that is typically used to weight the contribution of the different experts in the moe layers the proposed approach is compared with fully dense moes and the recent switchmoe baselines in a language modeling task the paper proposes a simple and easy to implement idea however the experiments do not clearly show that densetosparse dts is better in terms of accuracycompute tradeoffs than fully sparse baselines such as switchmoe in particular although figure 3b should show this flop vs accuracy but the lowest perplexity is achieved by switchmoe with 8 cdot 1033 flop and dtsmoe only matches switchmoe at around 65 cdot 1013 flop most importantly flop is not very well correlated with actual training time nor energy consumption since if some sort of masking is used to implement dts rather than true sparsity some compute is wasted to compute the experts outputs that are then masked to this reviewer it is not clear if masking or some thresholding was used to evaluate dts figure 1 depicts a threshold which could mean that dts avoid computing the output of some experts the ones whose weight falls below the threshold and that the flops depicted in figure 3b and thorugh the rest of the paper are more correlated with actual runtime however this threshold is not mentioned anywhere else in the paper so its very likely that when the temperature is very low ie only 1 of the experts weights is effectively nonzero all the experts are being actually used in any case notice that even if the reported flops correlate well with runtime the observation regarding switchmoe is still true moreover many of the experiments do not report results on a test set not even a validation set figures 3 arguably the main plot and 5 report training perplexity rather than the perplexity on a separate validation or test set however figure 4 and 6 do report results on the validation set its unclear to this reviewer whats the criterion used to report results on one set or the other in any case validation perplexity or even better test perplexity should have been used through all the paper training measuraments are relevant sometimes but not in the context of this paper there are some results from the effect of balanced loads analysis section 424 which are also not clear to this reviewer as well in particular figure 6a shows that the use of balanced loads adds an overhead of approximately 5 cdot 1011 flops in the openwebtext dataset why why only in this task but not in the wikipedia task in addition notice that one of the main practical reasons to use a balanced load eg in switchmoe is to efficiently use all available compute which is not considered in the analysis presented minor typos and comments abstract an neegative impact a negative impact page 4 here we train a 24layer transformer decoder model with and every other ffn here we train a 24layer transformer decoder model with every other ffn section 423 there seems to be notes left from the draft of the paper need we analyze it in tokenlevels weights section 424 we directly loads we directly load the main claim of the paper the current approach of jointly training experts and the sparse gates introduce a negative impact on model accuracy is not well supported by the experiments in addition to that many of the experiments report training measurements rather than 
test or validation measurements comparing training perplexity is not enough to show the superiority of the proposed approach some of the details of the implementation are not clear to this reviewer and affect the flop vs accuracy comparison actual runtime comparison on a particual hardware architecture would be much better given all these issues i recommend to not accept the paper update after rebuttal the authors have addressed many of my concerns during the rebuttal period satisfactorily my concern about the claimed speedup is still present but the authors have stated why they decided to use flops rather than actual runtime on a given hardware and implementation based on all this im slightly increasing my score for this submission docsepthis paper studied the problem of training the gating network for the mixtureofexperts based model architectures to make the gating network training more stable and robust the authors proposed a densetosparse gating training algorithm that uses gumbel noise and temperature tuning the authors conducted experiments on large nlp datasets strengths the proposed scheme to train dense gating then making it sparser makes sense and could potentially stabilize and improve the training of gating networks the authors conducted extensive experiments to evaluate both the effectiveness and efficiency of the proposed algorithm weaknesses stabilizing the training of the sparse gating network has been the main challenge of the moe based model architectures there exists some research with similar ideas trying to make the training of moe more robust 1 this paper uses gumbel noise to stabilize the learning of softmax gating httpsarxivorgabs191004915 2 this paper uses a binary gating with similar noise and a l0 regularizer to control the sparsity httpsojsaaaiorgindexphpaaaiarticleview3788 3 this paper uses a binary encoding and diffienable operators to smooth the learning of sparse gating with initializing to be dense and later converging to sparse httpsarxivorgabs210603760 given these existing literatures i feel like the novelty of this paper is quite limited in the meantime since the sparsity of the gating is controlled by the annealing of temperature of the softmax only top1 gating is supported this also limits the capacity flexibility of the moe based morel architecture overall i think the training scheme from dense to sparse for sparse moe gating makes sense and this is supported by experiments in the paper however i also think the technical novelty of this paper is limited given existing work with similar ideas on improving the gating network training docsepthis work proposes the densetosparse gate or dtsgate its a simple idea of starting training with soft continuous routing to all experts and then gradually reducing to hard discrete routing the authors claim that this improves the quality of traditional approaches the work proposes the novel idea and details how to schedule the sparsity adjustment the authors suggest this improves traditional approaches by 5 flopsefficiency notes the idea is an interesting one if the quality of sparse models can be significantly improved through using more compute only early on in training this could become a widely adopted technique reasonably wellwritten paper the authors use a reasonable baseline of a 24layer moe transformer 16experts for every other layer visualization of the model dynamics specifically gate weights distribution is a nice addition beyond plots and tables of numbers improvements i suggest a different graphic for 
the experts in figure 1 our field already anthropomorphizes neural networks enough id prefer to not see a little human with an etched brain as a standin for two matrix multiplications and a nonlinearity a style recommendation make each of your captions selfcontained with description and result so that the reader can quickly scan your work figuretofigure elaborate over the short phrases youre using eg figure 5 different temperature scheduler on page 8 dont use the abbreviations w and wo the significance of the gate values in figure 7 will likely be lost to many readers as stated above please elaborate in your captions to describe what is being shown and why its important to substantiate your contribution p8 though these tokens are spatial proximity fix questions figure 3 why are there periodic dips in the training perplexity curves if this is from repeating the data in the same order id recommend in the future that you shuffle data each epoch figure 4 demonstrates that moedts improves with more experts and expertlayers how does this scaling compare to moeswitch experiments results show that the flopsefficiency can be optimized into timeefficiency where was this shown all the plots and data i see are for flops not wallclock time on a specific hardware it is amazing that dtsgate can learn to load balance without extra loss was this verified to be a property only of dtsgate please confirm that naive approaches without load balancing do not learn this or else perhaps remove this also as a nit id avoid subjective claims like in using the word amazing im positively disposed towards this work however my main concern is on the lack of a clear demonstration of the magnitude and the significance of the gains please clearly show me the asymptotic quality is worse using vanilla approaches eg lepikhin et al 2020 fedus et al 2021 as far as i can tell the only substantiating plot for this is 3b but in my opinion this doesnt clearly showcase the improvement of the method at the very least these curves should be extended until moedts is conclusively better than moeswitch on a flops basis further though the idea is interesting showing stronger empirical results than a 5 flopsefficiency gain is likely needed to convince practitioners and researchers if this is the case i will improve from a 68 docsepthe paper proposes an algorithm dtsgate to warmup experts in an moe learning setup rather than applying standard topk selection per input the algorithm initially applies dense routing ie k total number of experts to avoid quick collapse and gradually relaxes it to become sparser and cheaper experiments and ablations in language tasks are provided to support the advantages of the algorithm this paper tackles an interesting aspect of conditional computation in the context of moes how do we start training from scratch when individual experts can easily break down and collapse the paper provides an intuitive answer we can start by training all experts together ie as a dense model and relax this over time hopefully once all individual experts are enough developed in some setups which interestingly coincide with some of the original motivation ones for moes this may not be possible for example if the number of experts is just huge some recent work has shown the effect of increasing k a lot see 1 figure 10 it seems that the effect of increasing k plateaus somewhat quickly while in those experiments k doesnt change over time i think this may suggest theres little to gain in doing so especially as its quite expensive 
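For readers following the routing discussion above, a minimal sketch of the dense-to-sparse gate being described could look as follows. This is an illustration only, assuming a Gumbel-perturbed softmax router with an annealed temperature and a small masking threshold; the class and function names, the threshold value, and the linear schedule are assumptions of this sketch, not the paper's actual implementation.

```python
# Illustrative sketch (not the authors' code) of a dense-to-sparse gate:
# a Gumbel-perturbed softmax router whose temperature is annealed so routing
# starts dense (all experts weighted) and gradually approaches top-1.
import torch
import torch.nn as nn

class DenseToSparseGate(nn.Module):
    def __init__(self, d_model: int, num_experts: int, threshold: float = 1e-2):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)
        self.threshold = threshold  # gate weights below this are masked to zero

    def forward(self, x: torch.Tensor, temperature: float) -> torch.Tensor:
        # x: (num_tokens, d_model) -> gate weights of shape (num_tokens, num_experts)
        logits = self.router(x)
        # standard Gumbel(0, 1) noise, often used to stabilise routing decisions
        gumbel = -torch.log(-torch.log(torch.rand_like(logits) + 1e-9) + 1e-9)
        weights = torch.softmax((logits + gumbel) / temperature, dim=-1)
        # masking rather than true sparsity: at high temperature nearly all experts
        # survive the threshold, at low temperature the gate is effectively top-1
        return torch.where(weights >= self.threshold, weights, torch.zeros_like(weights))

def temperature_schedule(step: int, total_steps: int,
                         t_start: float = 2.0, t_end: float = 0.1) -> float:
    # linear annealing from a dense (high-temperature) to a sparse (low-temperature) gate
    frac = min(step / max(total_steps, 1), 1.0)
    return t_start + frac * (t_end - t_start)
```

Because small gate weights are merely masked rather than skipped in a sketch like this, the FLOP count of such an implementation depends on whether the masked experts are actually evaluated, which is exactly the FLOP-versus-runtime concern raised in the first review above.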
overall i think the experiments in the paper do not show the proposed algorithm offers any advantage over standard topk routing switch in the paper figure 3 seems to be the only comparison between switch 2 ie no warmup and dtsgate expert warmup first the reported metric is training perplexity im not sure why this is the case test or validation perplexity should be reported in the second plot 3b which is the most relevant in my opinion flops or runtime rather than steps as in 3a as these hide very different costs it seems that switch dominates dts for any flops budget vertical line switch offers better performance than dts similarly for any attainable performance level switch gets there at a lower flops cost thus im not sure how the paper justifies the use of dts i may have misunderstood something though its not clear to me why the dts run in red was halted and didnt continue training theres a mismatch between both plots for the yellow curve moedense in the left one 3a perplexity is below 14 while in the right one 3b its above 15 why was the xaxis flops trimmed in the previous to last paragraph in page 6 theres this sentence dtsgate can obtain greater than 5 improvements in flopsefficiency compared with stateoftheart switchgate where can this be seen also and as the authors acknowledge in the conclusions the fact that dts doesnt impose balance if i understand correctly even when using a load balancing loss as opposed to having a maximum expert capacity that enforces it can lead to an inefficient use of current hardware some experts waiting for others for figure 6 as expert ids are independent across layers it would be nicer to sort experts per row according to their load so that clearer patterns emerge alternative idea one could try something even simpler initially train a dense model with one mlp only rather than e different mlps router for a number of steps then replicate the dense mlp to all the experts and add a router from scratch and apply topk as usual in other words pretrain a dense backbone for a bit and then initialize the expert model from this have the authors tried this i can see this working well typos second paragraph of page 6 set tokens of each sample as 1024 set tokens hidden size to be 1024 comparsion comparison references 1 scaling vision with sparse mixture of experts 2 switch transformers scaling to trillion parameter models with simple and efficient sparsity while the algorithm is reasonable the experimental results only one plot figure 3b do not suggest it provides any additional benefit compare to the standard baseline note after the rebuttal i increased the score from 3 to 5 docsepthis paper introduces the idea of using densetosparse dts training gates in mixtureofexperts moe based on a gradual sparsification process the novel produced method is evaluated on a transformer model overall it looks to me that the paper has rather limited novelty while the paper is wellwritten in my opinion the empirical evaluation support is not perfectly aligned with the paper claims strong points using gradual sparsification for the moe gates of a transformer model seems to offer a final performance perplexity in this particular case close to the moe dense gates while having lower computational requirements at least in theory as i believe that a binary mask is used to emulate sparsity is my understanding correct weak points and suggestions the paper claims are not perfectly aligned with the empirical evaluation if the proposed gradual sparsification with dst is intended to be for any 
moe model than other neural network models besides transformers shall be studied if the proposed method is particularly designed for transformer models then perhaps vision transformers shall be also studied based on the chosen direction new algorithmic baselines and datasets have to be added for comparison otherwise the broadness of the claims needs to be adjusted accordingly the related work discussion is missing very important work particularly gradual sparsification for densetosparse training has been introduced in 1 up to my best knowledge from that moment a large body of works has been released on this topic in parallel sparsetosparse training has been studied for vision transformer in 2 please add a consistent paragraph to discuss the above directions and to highlight clearly what is novel in this paper in comparison with those works otherwise the proposed method broadly speaking would be just a simple application of 1 or follow ups on moe gates for transformers a recent survey can give more details on these topics 3 can you please add an algorithm to better illustrate the proposed method is the lower bound from the adaptive capacity paragraph fixed across training how can one choose it why in figure 3b the dense model minimum ppl is higher than in figure 3a can you train all models from figure 3a for the same number of iterations now somehow it seems that the complete overview picture is missing i suggest to carefully proof read the whole paper the english usage and general appearance can be improved also there are a number of typos eg comparsion with as shown in figure 32 references 1 christos louizos max welling diederik p kingma learning sparse neural networks through 0 regularization iclr 2018 httpsopenreviewnetforumidh1y8hhg0b 2 tianlong chen yu cheng zhe gan lu yuan lei zhang zhangyang wang chasing sparsity in vision transformers an endtoend exploration neurips 2021 httpsarxivorgabs210604533 3 torsten hoefler dan alistarh tal bennun nikoli dryden alexandra peste sparsity in deep learning pruning and growth for efficient inference and training in neural networks jmlr 2021 httpswwwjmlrorgpapersv22210366html while there are some clear advantages highlighted by this paper in obtaining more efficient transformer models overall i believe that the paper is not ready yet for publication
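Before the meta-review below, here is a small sketch of the simpler warm-start baseline suggested in one of the reviews above: pretrain a single dense FFN, replicate it into every expert, attach a freshly initialised router, and only then apply standard top-k routing. The module names, the assumption that the FFN maps d_model to d_model, and the default k are assumptions of this sketch rather than anything taken from the paper.

```python
# Illustrative sketch of the reviewer-suggested alternative (an assumption of this
# note, not the paper's method): warm-start an MoE layer from one pretrained dense FFN.
import copy
import torch
import torch.nn as nn

class TopKMoEFromDense(nn.Module):
    def __init__(self, dense_ffn: nn.Module, d_model: int, num_experts: int, k: int = 1):
        super().__init__()
        # every expert starts as an identical copy of the pretrained dense FFN
        self.experts = nn.ModuleList(
            [copy.deepcopy(dense_ffn) for _ in range(num_experts)]
        )
        self.router = nn.Linear(d_model, num_experts)  # trained from scratch
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, d_model); standard top-k routing from the start
        gate = torch.softmax(self.router(x), dim=-1)
        topk_w, topk_idx = torch.topk(gate, self.k, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            idx, w = topk_idx[:, slot], topk_w[:, slot]
            for e, expert in enumerate(self.experts):
                mask = idx == e
                if mask.any():
                    out[mask] = out[mask] + w[mask].unsqueeze(-1) * expert(x[mask])
        return out
```

A warm start of this kind leaves the router as the only component trained from scratch, which would help separate how much of the reported gain comes from dense warm-up of the experts versus from the gating mechanism itself.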
### Summary: | this paper proposes a simple approach to improve the robustness of training a sparsely gated mixtureofexperts model which at a high level simply consists in training initially as a dense gated model to better warm start a final phase of sparse training results are presented to highlight the potential benefits of this approach the authors have provided a detailed response and updated results in response to the reviews each reviewer has also responded at least once to the author response despite that engagement all reviewers are leaning towards rejection though there is one reviewer with a rating of 6 they regardless state that im confident this will make a great resubmission at a future venue indicating they actually support rejection the reviewers point out that the proposed method is not really novel pointing to an existing recent paper even without that prior work i would also argue that the proposed approach is conceptually straightforward and has benefits that were fairly predictable and not particularly surprising given the generally lukewarm reception from the reviewers i think there is a legitimate concern to be had here about this works potential for impact though the review process has definitely improved the papers manuscript since its submission i unfortunately could not find a reason to dissent from the reviewers consensus that this submission is not ready to be published therefore recommend it be rejected at this time | [
(input_ids: numeric token IDs for the row text above; values omitted)
] | [
(attention_mask: the value 1 repeated once per token; values omitted)
] | [
368,
250,
970,
24088,
4677,
608,
1027,
3276,
8194,
14398,
50276,
251,
3239,
854,
13414,
897,
253,
490,
25669,
259,
285,
32063,
50276,
783,
8453,
273,
253,
7394,
2193,
275,
4677,
818,
588,
2779,
320,
3663,
281,
1142,
10668,
347,
4767,
1840,
4496,
21184,
275,
634,
3403,
621,
281,
6266,
752,
310,
1146,
2011,
285,
2139,
697,
1774,
281,
4326,
4513,
634,
7680,
50276,
81,
25,
2167,
841,
21761,
403,
8820,
18326,
50276,
11097,
50276,
34974,
50276,
13206,
495,
2139,
403,
627,
15316,
277,
2824,
275,
253,
3733,
44229,
414,
9191,
604,
436,
310,
432,
24385,
253,
941,
275,
253,
1072,
1340,
2654,
5583,
275,
253,
2852,
326,
368,
46671,
941,
1016,
23657,
50276,
13206,
577,
14371,
326,
5497,
264,
1641,
19132,
342,
625,
10071,
285,
6485,
33990,
849,
1057,
436,
13642,
7277,
281,
5497,
265,
88,
2682,
50276,
16217,
3825,
1543,
921,
326,
253,
892,
412,
339,
15412,
476,
320,
18325,
715,
673,
46505,
50276,
2811,
369,
436,
2011,
512,
253,
14777,
285,
941,
891,
923,
403,
323,
892,
2695,
417,
3402,
13273,
673,
327,
247,
2173,
10309,
50276,
262,
310,
8644,
326,
277,
1641,
15353,
476,
3037,
281,
3301,
6654,
1293,
4465,
2957,
50276,
4238,
436,
16058,
281,
320,
247,
2867,
760,
273,
277,
1641,
15353,
4496,
6583,
326,
27785,
7274,
1293,
3301,
26259,
513,
417,
3037,
436,
390,
2010,
4931,
5386,
436,
671,
347,
247,
12389,
2654,
3693,
17854,
3916,
751,
275,
970,
253,
3159,
8644,
516,
14962,
15432,
4404,
436,
789,
2299,
619,
2022,
4468,
310,
327,
253,
3480,
273,
247,
2590,
20028,
273,
253,
9777,
285,
253,
8453,
273,
253,
15988,
50276,
32897,
4518,
921,
479,
253,
20185,
3290,
310,
7197,
970,
26724,
7274,
24088,
458,
81,
20323,
249,
1162,
355,
9169,
10208,
316,
1162,
355,
43425,
347,
2080,
347,
891,
476,
2028,
253,
760,
4326,
15544,
7484,
323,
436,
310,
495,
67,
533,
275,
619,
4743,
436,
36908,
4518,
34647,
253,
7756,
273,
253,
1332,
387,
253,
1077,
1878,
841,
9191,
943,
320,
6508,
1919,
5497,
264,
1641,
310,
345,
12817,
1805,
685,
5497,
265,
88,
2682,
327,
247,
892,
2695,
3720,
2007,
2167,
253,
2934,
310,
4722,
4645,
10046,
16774,
1543,
685,
247,
608,
892,
412,
339,
15412,
6351,
310,
2779,
3058,
281,
18578,
24432,
285,
8607,
604,
436,
310,
253,
1083,
891,
588,
3157,
432,
247,
9934,
50275,
7152,
339,
431,
248,
2929,
29328,
271,
5933,
277,
1641,
15353,
281,
5890,
484,
10071,
275,
271,
278,
3703,
4715,
9978,
2581,
685,
9433,
2629,
1755,
76,
5438,
591,
3280,
253,
5933,
8523,
10384,
14086,
24749,
26332,
465,
50276,
13074,
1180,
273,
10071,
281,
3693,
3158,
13551,
285,
13237,
7921,
265,
352,
281,
2489,
653,
9332,
285,
20182,
4679,
285,
490,
77,
569,
275,
3448,
8892,
403,
2530,
281,
1329,
253,
11361,
273,
253,
5933,
436,
2929,
39223,
271,
4722,
4809,
273,
17697,
13782,
275,
253,
3634,
273,
5497,
265,
849,
513,
359,
1265,
3733,
432,
20041,
672,
2060,
10071,
476,
4354,
2740,
1066,
285,
13551,
50276,
783,
2929,
3400,
271,
27350,
3662,
359,
476,
1265,
407,
3733,
512,
10071,
2366,
26332,
347,
247,
14086,
1566,
285,
7921,
436,
689,
673,
18670,
2378,
512,
2060,
10071,
403,
2217,
3715,
275,
690,
873,
8777,
534,
4722,
314,
28588,
342,
690,
273,
253,
3236,
16038,
4394,
323,
5497,
265,
436,
778,
417,
320,
1896,
323,
1650,
604,
253,
1180,
273,
10071,
310,
816,
5699,
50276,
8826,
3332,
789,
556,
2011,
253,
1055,
273,
3629,
465,
247,
2257,
923,
337,
4677,
884,
352,
3133,
326,
253,
1055,
273,
3629,
465,
5340,
666,
8489,
4541,
1223,
275,
1110,
4679,
465,
36908,
1818,
689,
673,
891,
1158,
436,
778,
1804,
253,
373,
1652,
281,
6351,
275,
2509,
594,
3340,
347,
697,
3240,
8214,
50276,
1189,
455,
891,
1158,
253,
4679,
275,
253,
2929,
513,
417,
921,
253,
4081,
5933,
6131,
667,
5750,
689,
2629,
1755,
76,
24749,
5234,
275,
253,
2929,
50276,
13206,
495,
3133,
281,
320,
253,
760,
5301,
875,
5234,
374,
26332,
642,
5890,
484,
285,
277,
1641,
15353,
6485,
5890,
484,
806,
253,
2361,
7982,
310,
3733,
44229,
414,
516,
417,
2119,
2139,
436,
310,
253,
1083,
1071,
390,
12820,
44229,
414,
943,
320,
2361,
275,
253,
1273,
7484,
495,
67,
534,
310,
253,
954,
4623,
275,
619,
4743,
892,
2695,
390,
20243,
2581,
685,
5018,
347,
275,
495,
66,
347,
841,
10507,
1077,
1027,
4815,
352,
3133,
326,
5234,
36807,
277,
1641,
323,
667,
892,
2695,
7563,
9118,
1386,
5234,
6131,
1805,
3045,
685,
277,
1641,
12014,
323,
667,
20685,
494,
3045,
1268,
5234,
4850,
627,
387,
247,
2406,
892,
2695,
2105,
3021,
516,
417,
2119,
849,
253,
2929,
816,
7790,
253,
897,
273,
277,
1641,
891,
778,
452,
46485,
1633,
2167,
697,
417,
2590,
281,
479,
2139,
253,
277,
1641,
1408,
275,
2502,
369,
37350,
285,
42126,
4035,
3733,
253,
373,
247,
29713,
875,
1097,
14777,
323,
253,
8862,
6970,
5497,
264,
1215,
275,
253,
1669,
581,
495,
66,
44229,
414,
310,
2708,
1638,
1223,
275,
253,
987,
581,
495,
67,
697,
1840,
1458,
2139,
369,
253,
1269,
10565,
892,
2695,
36756,
50276,
249,
253,
2045,
281,
1390,
12494,
275,
3239,
721,
253,
373,
436,
6197,
277,
1641,
15353,
476,
4044,
3687,
685,
608,
11701,
275,
892,
412,
339,
15412,
2429,
342,
1375,
23037,
14387,
5234,
15353,
50276,
2811,
476,
436,
320,
2326,
50276,
12563,
285,
347,
253,
4477,
14409,
275,
253,
11815,
253,
958,
326,
277,
1641,
36908,
16209,
6654,
604,
891,
2096,
9113,
1014,
672,
970,
247,
3301,
26259,
2957,
347,
10066,
281,
1907,
247,
4869,
6485,
5350,
326,
546,
36217,
352,
476,
1421,
281,
271,
31334,
897,
273,
1655,
10309,
690,
10071,
6179,
323,
2571,
50276,
1542,
4677,
721,
347,
6485,
44077,
403,
3907,
2439,
8090,
352,
651,
320,
49482,
281,
3686,
10071,
591,
4194,
2556,
281,
616,
3301,
594,
326,
30909,
6127,
20177,
50276,
30991,
800,
2934,
581,
812,
1611,
1633,
1014,
19554,
8523,
6194,
247,
14086,
1566,
342,
581,
13361,
81,
760,
2581,
685,
299,
1027,
13361,
793,
50276,
37564,
323,
247,
1180,
273,
5018,
840,
25464,
253,
14086,
13361,
81,
281,
512,
253,
10071,
285,
823,
247,
23093,
432,
20041,
285,
4647,
1755,
76,
347,
7312,
275,
643,
3000,
3215,
1949,
247,
14086,
27882,
323,
247,
2372,
285,
840,
26641,
253,
6485,
1566,
432,
436,
452,
253,
4477,
3597,
436,
891,
476,
923,
436,
2444,
973,
50275,
555,
993,
50276,
9815,
12494,
273,
3239,
721,
873,
21761,
273,
1016,
3410,
347,
27277,
50276,
1178,
21761,
8763,
1979,
281,
320,
27277,
50276,
3118,
1032,
279,
50276,
47109,
50275,
250,
3065,
50276,
18,
13642,
8113,
342,
23507,
7802,
273,
10071,
50276,
19,
5234,
4979,
398,
13642,
281,
28126,
4764,
3210,
342,
2969,
285,
5919,
37139,
414,
1223,
253,
5933,
310,
5272,
253,
5661,
1543,
760,
581,
7484,
4677,
495,
67,
513,
417,
1804,
352,
3400,
667,
3081,
5649,
7277,
281,
253,
2629,
8245,
50274,
9939,
846,
253,
30080,
22559,
891,
2559,
253,
4868,
432,
495,
281,
608,
5474,
33032,
2520,
2929,
23970,
253,
2934,
273,
970,
12006,
292,
375,
12083,
277,
1641,
3733,
18488,
275,
7802,
80,
453,
89,
468,
1641,
278,
3703,
1754,
327,
247,
26830,
37139,
1877,
1232,
253,
4460,
4197,
1332,
310,
6760,
327,
247,
39707,
1566,
4583,
352,
4453,
281,
479,
326,
253,
2929,
556,
2581,
3710,
38135,
1223,
253,
2929,
310,
973,
15720,
275,
619,
4743,
253,
16774,
7103,
1329,
310,
417,
9670,
15616,
342,
253,
2929,
3916,
2266,
2792,
50276,
5302,
26830,
37139,
1877,
323,
253,
278,
3703,
18488,
273,
247,
39707,
1566,
3133,
281,
3959,
247,
2457,
3045,
44229,
414,
275,
436,
1798,
1083,
2810,
281,
253,
278,
3703,
14086,
18488,
1223,
1907,
2406,
15180,
6095,
387,
1878,
275,
3762,
347,
891,
2868,
326,
247,
8985,
8989,
310,
908,
281,
802,
4187,
37139,
414,
310,
619,
4685,
3451,
50276,
20881,
2792,
285,
13991,
50276,
783,
2929,
3916,
403,
417,
9670,
15616,
342,
253,
16774,
7103,
604,
253,
4081,
26830,
37139,
1877,
342,
24334,
310,
6034,
281,
320,
323,
667,
278,
3703,
1566,
685,
643,
11454,
2990,
3210,
16280,
4979,
398,
3091,
320,
5421,
604,
253,
4081,
1332,
310,
3782,
4158,
323,
39707,
3210,
840,
4931,
8113,
4979,
398,
3091,
320,
671,
5421,
1754,
327,
253,
6777,
3884,
747,
5933,
280,
1666,
25379,
285,
15302,
452,
281,
320,
2879,
323,
5301,
5010,
253,
3862,
1255,
273,
253,
3916,
3198,
281,
320,
10904,
15672,
50276,
783,
2905,
789,
5955,
310,
5816,
1077,
1774,
789,
3782,
26830,
37139,
1877,
323,
12006,
292,
375,
12083,
3733,
556,
644,
5611,
275,
337,
598,
281,
619,
1682,
3640,
432,
326,
2774,
247,
1781,
2133,
273,
2987,
556,
644,
4439,
327,
436,
9400,
275,
7529,
37139,
292,
375,
12083,
3733,
556,
644,
5421,
323,
8113,
39707,
275,
374,
4496,
823,
247,
5185,
12494,
281,
2319,
253,
1840,
10746,
285,
281,
6780,
4518,
752,
310,
4460,
275,
436,
2929,
275,
5301,
342,
1110,
2987,
5010,
253,
4081,
1332,
21450,
8288,
651,
320,
816,
247,
2969,
2898,
273,
337,
390,
956,
35267,
327,
278,
3703,
18488,
323,
4979,
398,
247,
3332,
6630,
476,
1918,
625,
4278,
327,
841,
12989,
495,
50276,
5092,
368,
4496,
823,
271,
5933,
281,
1805,
17093,
253,
4081,
1332,
50276,
261,
253,
2406,
3033,
432,
253,
17825,
5350,
12494,
4229,
2439,
3733,
849,
476,
581,
5206,
352,
50276,
22309,
275,
4677,
495,
67,
253,
14086,
1566,
5927,
268,
446,
310,
2169,
685,
275,
4677,
495,
66,
476,
368,
6194,
512,
3210,
432,
4677,
495,
66,
323,
253,
1072,
1180,
273,
25142,
1024,
10380,
352,
3133,
326,
253,
3426,
18389,
5406,
310,
5816,
50276,
74,
1804,
281,
9257,
4737,
1239,
253,
2644,
2929,
253,
48087,
10393,
285,
2087,
7286,
476,
320,
5520,
671,
627,
403,
247,
1180,
273,
963,
993,
24088,
509,
1032,
279,
342,
347,
2011,
275,
4677,
4567,
50275,
250,
3065,
50276,
18,
37622,
375,
29245,
478,
375,
2781,
973,
272,
4962,
254,
1479,
268,
6963,
785,
4715,
23507,
11454,
6928,
949,
470,
37820,
17857,
32888,
4765,
5987,
5758,
15337,
3024,
39061,
301,
73,
18,
90,
25,
12155,
72,
17,
67,
50275,
19,
246,
757,
5056,
260,
864,
50276,
30838,
260,
24176,
50276,
91,
248,
36827,
50276,
7675,
340,
9041,
50276,
42157,
1182,
12109,
50276,
91,
12109,
31524,
259,
606,
31702,
37139,
414,
275,
8113,
4979,
398,
271,
990,
936,
423,
17947,
5723,
2824,
43425,
5987,
39962,
2061,
5375,
16899,
1549,
1857,
1610,
50275,
20,
7263,
16750,
8511,
832,
2146,
16447,
355,
43418,
73,
5269,
2240,
79,
328,
295,
1479,
10424,
6079,
3354,
247,
1591,
17244,
268,
16300,
37139,
414,
275,
3676,
4715,
819,
25004,
285,
3116,
323,
5919,
17032,
285,
3733,
275,
11454,
6928,
480,
1686,
83,
43425,
5987,
2700,
75,
1686,
83,
2061,
50004,
87,
1423,
16899,
25772,
2974,
50276,
6050,
627,
403,
690,
2590,
11361,
16318,
407,
436,
2929,
275,
13546,
625,
5919,
39707,
3210,
4583,
891,
2868,
326,
253,
2929,
310,
417,
4704,
2568,
323,
9311,
50276,
187,
187,
4118,
18435,
27,
2520,
2929,
29328,
247,
2969,
2746,
281,
3157,
253,
31640,
273,
3733,
247,
37139,
600,
305,
456,
7802,
80,
453,
89,
468,
1641,
1566,
534,
387,
247,
1029,
1268,
3365,
8414,
275,
3733,
8523,
347,
247,
14086,
305,
456,
1566,
281,
1805,
5890,
1265,
247,
2457,
3408,
273,
23507,
3733,
1543,
403,
3559,
281,
6780,
253,
2442,
5373,
273,
436,
2746,
50276,
783,
4477,
452,
2530,
247,
7000,
2380,
285,
9300,
1543,
275,
2380,
281,
253,
10123,
1016,
37317,
556,
671,
10974,
387,
1878,
2378,
281,
253,
2488,
2380,
5747,
326,
13226,
512,
30628,
403,
25661,
4404,
18235,
2167,
627,
310,
581,
37317,
342,
247,
13716,
273,
721,
597,
10159,
1375,
326,
516,
13224,
436,
588,
1056,
247,
1270,
501,
538,
2230,
387,
247,
2852,
18767,
7809,
597,
2686,
1329,
18235,
50276,
783,
30628,
1127,
562,
326,
253,
4081,
1332,
310,
417,
1663,
4460,
13458,
281,
271,
5368,
3332,
2929,
1014,
1293,
326,
2720,
789,
891,
651,
671,
9059,
326,
253,
4081,
2746,
310,
4473,
1230,
15246,
285,
556,
5373,
326,
497,
9648,
28826,
285,
417,
3782,
10084,
1677,
253,
3839,
298,
17936,
44041,
16112,
432,
253,
30628,
891,
1158,
627,
310,
247,
14905,
4468,
281,
320,
574,
1060,
670,
436,
2987,
2442,
323,
3486,
50276,
2004,
253,
2278,
1232,
556,
7964,
5520,
253,
9380,
7714,
1580,
697,
19529,
891,
19235,
812,
417,
1089,
247,
1921,
281,
18776,
432,
253,
30628,
13969,
326,
436,
19529,
310,
417,
4704,
281,
320,
3863,
3103,
5583,
352,
320,
10945,
387,
436,
673
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper models noisy observations from complex dynamical systems consisting of multiple interacting subsystems by a set of sparsely interacting recurrent networks the interactions between the recurrent networks or modules are constrained to be spatially localized this is done by embedding the position of each module in a metric space and scaling the strength of the interactions between modules using this metric the overall effect of this constraint is to induce spatial structure in the global dynamics of the interacting modules the paper argues that this constraint is a good inductive bias for capturing the dynamics of systems that consist of sparsely interacting agents like cars and traffic lights in a traffic simulation or characters and actions in a video game the paper uses recurrent neural networks for modeling the individual subsystems as these are flexible nonlinear dynamical systems that can model the local dynamics of individual modules the interactions between these systems is captured through an attention mechanism these attention weights are modulated scaled by a similarity between embedded positions of pairs of modules i found the paper interesting but i thought the presentation could be clearer and the paper could better demonstrate the importance of the spatial structure first as far as i can tell the main difference between this work and previous work specifically the recurrent interacting modules rim paper is the addition of the positional embedding and corresponding weighted interaction terms given that the key novelty is the positional embedding i think the paper should spend more time introducing and motivating how that is implemented and experiments testing that implementation the positional embedding is introduced in a few lines at the top of section 4 with the motivation that the chosen function is commonly used and cites the original transformer paper however the motivation for positional encodings in nlp tasks for the transformer seems to me to be completely different from the motivation for positional or spatial embeddings in this work this work is largely aimed at modeling interacting systems in the physical world whereas the positional encoding in vaswani et al was motivated as a simple way to introduce signals to the network that represented position given that these are different domains i think the positional embedding should be better motivate here in addition the paper suggests that other choices might also be viable what are these other choices did the authors experiment with them second how are the hyperparameters for the similarity metric epsilon and tau chosen these govern how close two subsystems need to be to interact but as far as i can tell they are simply presented as constants in a table in the appendix how does performance vary as you change these at the very least the main text should state how these were chosen third from what i can tell the s2gru has less capacity than a system that models all of the interactions such as a global lstm if so then how come the lstm underperforms on simple tasks such as the bouncing balls presumably the lstm could just learn the same interactions that are in the s2gru is this underperformance due to issues with trainability the global lstm is harder to train or perhaps it still has limited capacity eg it does not have enough units or layers to model the dynamics if the latter it would be nice to see experiments with different numbers of units andor more training data showing that for large systems the lstm is 
as good as or better than the s2gru since we expect the lstm to be able to learn any nonlinear dynamics then as you reduce either the number of units or training data perhaps the s2gru starts to outperform as the inductive bias of the s2gru starts to win basically if the spatial structure is really an inductive bias i would expect the benefit to go away with a higher capacity model or more training data where inductive biases are not needed as we can just learn directly from data fourth are there cases where the spatial structure is not a good inductive bias for example situations where modeling the interactions between all pairs of agents or subsystems is necessary to capture the dynamics if so i would like to see experiments on these problems that show that the s2gru does not outperform a global model such as an lstm this would be a nice control to show that the limits of the inductive biases of the imposed spatial structuredocsepthe authors propose a recurrent state space model for partially observed data where at any given time only a partial subset of observations are accessible to the model to make future predictions this is an important and challenging problem in ml as we often only have access to a partial view of temporally evolving data the authors propose uses a recurrent model which models a dynamically evolving partially observed process by introducing spatiotemporal interactions across multiple recurrent neural nets the key contribution of their work is the use of a shared embedding space that allows them to employ a mechanism to modulate spatial and temporal interactions between the rnns the proposed method learns a mapping function for both the embeddings corresponding to the rnns and the embedding corresponding to the observations and locations to a shared metric space since the functions used to map the observations and locations and the embeddings vectors associated with each rnn are learned the joint learning objecting corresponds to learning a topological structure on the points in this shared metric space additionally the authors propose a truncated kernel to attenuate effects of observations far apart in time the final learning objective then encourages the shared metric space to be topologically organized in a way such that the embedding space is partitioned into regions where each region places a varying responsibility over each of the different rnns since the observations are also embedded into this space any given observation can then be mapped into the embedding space in the embedding space the observation embedding be close to one or more embedding points corresponding different rnns which in turn models the dynamics in that regime the idea itself is quite interesting and allows for a soft representation of a state space model which through the embedding respace resembles an explicit duration factorial markov model the shared embedding space is an interesting idea and can be extended to other time series models as well the experiments along comparisons to strong baselines demonstrates the validity of their approach in addition to the interesting core idea the authors employ a fairly involved attention mechanism given the space limitation it is understandable however it would have been informative to have some experiments with some ablation studies ie what is the simplest model structure they could have used while still incorporating the shared metric space idea to make the model more parameter efficient some minor notational issues dx seems to be defined as a 
function space in the model however it is not a function space one could think of the forward evolutions of an rnn with a memory cell having an equal representation in function space however i feel the function space characterization of dynamical system section 2 obfuscates the model structure additionally sta can be inferred to be pxta however sta is not defined in the text a few clarifications from the textual descriptions could the authors elaborate on how exactly does e process all observations in parallel across t and a explained by the fact that unlike recurrent models it does not leverage the temporal dynamics to fill in the missing information due to fewer available observations could you elaborate on why you dont expect the individual rnns to leverage the temporal dynamics in the experiments does the reported metric f1 score account for label switching as currently modeled only the forward rollout of the rnns are used could this be extended to having a bidirectional rnn one might expect better global estimates as we switch from filtering to a smoothing estimate overall i believe this an interesting research direction and the approach proposed by the authors can be extended to other useful models the experiments are well thought out the paper organization could be clearer but as it stands it is easy to understand after a thorough read docsep summary of the paper the authors propose a novel architecture for synthesizing information from multiple local observations and making predictions into the future given a set of observations each with an associated location the model maps these into an embedding space a set of rnns each with a learned embedding that corresponds to a vector in the embedding space is used to process observations that are close in the embedding space the activations that the rnns can receive exchange between each other and output for a given query location are modulated by the distance in the embedding space this is achieved through three separate attention mechanisms with a similar design a fully nonlocal alltoall attention layer is followed by a multiplication with a kernel that falls off and is eventually cut off to 0 with distance this type of attention is used once for computing the inputs to the rnns once for the hidden state of the lstm exchanging information between rnn modules that are close by and once when computing an output response given a query location relation to prior work the paper references previous work that deals with synthesizing information from multiple localized observation and positions itself reasonably with respect to prior work experimental evaluation the authors demonstrate results on two domains a bouncing balls video prediction task which is wellknown and useful but could be considered a toy task and predicting sequences from starcraft 2 battles in the case of starcraft 2 the target for the prediction are raw unit stats and actions rather than images the model generally outperforms baselines on the bouncing balls task and generalizes better on the starcraft 2 domain but does not outperform baselines on the training task itself its difficult for me to judge whether the model will also work on other domains based on these experiments im curious whether some of the components of the model are actually needed eg attention between rnn modules in my opinion it would improve the paper significantly if some results on ablation experiments would be provided another aspect of the model that could be investigated is the dependence of the 
performance on the kernel that modulates information exchange by distance in the embedding space what happens if the modulation is disabled presentation and clarity the paper is generally well written but i did have some difficulties understanding the goal and approach because the paper relies on a lot of wordy exposition some of which is highly speculative in my opinion i believe that the paper would be clearer if the introduction was shortened this could also leave more space to discuss the experiments in more detail or add an ablation study conclusions i found the architecture introduced in the paper genuinely interesting and would like to see more work in this direction i would give the paper a higher rating if it included ablation experiments or if it was adjusted to simplify and shorten some of the discussion in the introduction edit after author comments ive read the author comments and the updated version of the paper although the authors claim that they have shortened the introduction of the paper by 1 page this doesnt actually seem to be the case in the last version that was uploaded where the section titled introduction is almost unchanged compared to the original upload ive used the diff tool between the latest and original version maybe there was a misunderstanding and the authors have shortened a different part of the paper although it would have been nice to shorten and streamline the introduction to make the paper easier to read its not critical to my rating the added ablation experiments demonstrate that each of the different attention modules proposed in the paper improve results which i think really improves the paper i was originally not sure whether the complexity of the model was justified but the new experiments demonstrate that each of the components seems to be needed ive also carefully read the rest of the paper and the author comments explaining details of the tasks under study and now feel that i have a much better understanding of what was done and how the model could be used for other tasks i agree with r4 that the novelty of the paper might not be groundbreaking but i believe the paper could be relevant and interesting for other researchers who want to incorporate attention mechanisms into their architectures so i recommend accepting the paper ive increased my rating from 6 to 7docsep summary this paper is concerned with making predictions about the global state of a dynamical system in a partially observable setting where only local observations are available concretely this paper studies video prediction from glimpses and worldmodeling in a multiagent setting to solve this task the paper proposes spatially structured recurrent modules s2rms an rnnbased dynamics model based on interacting subsystems s2rms are an extension of rims goyal et al 2019 with the main difference being that each module additionally contains an embedding vector that is meant to reflect its location in some learned metric space this positional information can then be used to limit the interactions between modules to distribute inputs that come with an associated location to the modules and to query a particular subpart of the global state at a future point in time it is shown how s2rms outperforms a number of baselines which are obtained by combining the query mechanism of a gqn with either an lstm rims or a relational rnn to model the dynamics pros cons justification overall i find that the paper is reasonably well written although the clarity can be improved significantly the first 
four pages essentially only focus on motivation and problem statement which is excessive especially since the main contribution is an improved rnn architecture in contrast section 4 that describes the method leaves important details about the adapted input attention mechanisms and intercell attention mechanisms to the appendix similarly the experiments on the gridworld are entirely discussed in the appendix which i have therefore not considered as part of the contribution more generally the problem formulation seems unnecessarily broad given the relatively narrow contribution that is made the considered problem setting is interesting and i believe relevant to several realworld settings the contribution itself is rather incremental since it essentially only involves associating a location with each module in rims the adaptations to the existing attention mechanisms in rims that this then necessitates are straightforward although additionally having the rim input attention over the local observations is interesting and appears novel i couldnt figure out if the original rims already attend to a setrepresentation of the input ie where each element is a different spatial location of the learned cnn representation however this paper does not compare alternative ways of implementing these changes or provide an ablation which leaves it open what the effect is of adapting the attention mechanisms in this way i also note that requiring the local observations to include an associated spatial location is arguably a more narrow setting than that explored in rims if the input attention in rims is additionally directed to attend over the encoded local observations and it would have been interesting to understand the importance of this for example how do s2rms compare to rims when a similar encoder is used for rims except for using the spatial coordinate the experimental evaluation indicates that s2rms outperforms the considered baselines and most notably rims although there are some concerns firstly for the baselines the aggregated representations are computed as a sum of the output of the encoder applied to each local observation which seems like a major bottleneck while s2rms can use attention to attend to each patch separately information is almost certainly lost for the baselines in this way i can understand why this may be necessary for the lstm but why not let rim attend to the encoded local observations separately currently this makes it difficult to understand in what way s2rms improve over rims especially since the reported margin already is quite small secondly and related to this i believe that it is important to include an ablation of the changes proposed to the attention mechanism for example what happens if positional information is only used for the recurrence but not for the input attention ie only using the global term or what if it is only used for input attention but not for the recurrence understanding the effect of these design choices would help strengthen the contribution and make it more significant compared to rims given these issues i can not quite yet recommend an accept at this point in time but i would be willing to increase my score depending on the outcome of some of the experiments i have suggested more generally i encourage the authors to revise section 24 to put more emphasis on the actual contribution and discuss the specific design in detail detailed comments why is it necessary to choose between the product of local and global terms and only the local term for the 
input attention it seems to me that the local terms should suffice and i wonder what the effect is of having this additional term i would appreciate a more detailed discussion of prior work in the related work section currently the paper only makes sweeping claims about how the considered setting is different from a bunch of prior approaches rather it would be more interesting to point out specific parallels or discuss how ideas from prior work may be included in the considered setup or in what way those ideas have already been reused i am also missing a discussion of objectcentric approaches to performing physical prediction tasks like rnem steenkiste et al 2018 op3 veerapaneni et al 2020 etc especially in relation to the considered bouncing balls task arguably on this task these methods provide the best tradeoff between locally interacting subsystems and more global modeling since each rnn specializes to a specific object which is evidently not achieved by s2rms since each rnn learns to specialize on a single object it can easily be viewed as modeling the global state of the system through locally interacting subsystems acting on partial observation on the other hand it is clear that s2rms and rims by extension offer other advantages for example an advantage of s2rms is that they consider modules having their own weights which can thereby specialize on specific interactions between objects or break down interactions between objects further rnem and the likes are limited to modeling the same set of interactions between interacting subsystems given by objects although evidently the notion of an object can also be flexible in this case ie when an objectcentric representation corresponds to two physical objects in pixel space the visualization in figure 5 suggests that individual modules specialize but there also seems to be quite some redundancy would it not be possible to quantify the achieved modularity somehow eg by measuring iou in that case it would be interesting to see how the system behaves when the number of provided modules is insufficient to model each individual actor in the considered system and when the number of modules is equal to the number of actors eg objects on bouncing balls additionally do you have any intuition for what happens if modules are removed at testtime why are rims not included in table 1 i would be surprised if it performs that much worse to s2rm since both models perform similar on the bouncing balls task if this is the case then please at the very least provide some intuition or insight to explain this difference please include a discussion of limitations in the conclusion please take a moment to go through the references and correctly cite papers that have been published i have improved my score following the improvements made by the authors see my reply below for details
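Across the reviews above, the mechanism at issue is described consistently: an all-to-all attention step whose weights are rescaled by a kernel over learned positions in a shared metric space, a kernel that decays with distance and is cut to zero beyond a threshold (the epsilon and tau constants one reviewer asks about). The sketch below is a minimal NumPy illustration of that idea only; the function names, shapes, and the Gaussian-style kernel are assumptions made for illustration, not the paper's actual implementation.

```python
import numpy as np

def truncated_kernel(dist, tau=1.0, eps=0.1):
    # Similarity that decays with distance and is clipped to exactly 0 once it
    # falls below eps; tau sets the decay rate. Both values are illustrative
    # constants standing in for the hyperparameters the reviews refer to.
    k = np.exp(-(dist ** 2) / tau)
    return np.where(k < eps, 0.0, k)

def distance_modulated_attention(q, k, v, q_pos, k_pos, tau=1.0, eps=0.1):
    # q:     (nq, d)  content vectors of the querying modules / observations
    # k:     (nk, d)  content vectors of the attended modules
    # v:     (nk, dv) payload carried by each attended module
    # q_pos: (nq, p)  positions of the queries in the shared metric space
    # k_pos: (nk, p)  positions of the keys in the same space
    d = q.shape[-1]
    logits = q @ k.T / np.sqrt(d)                        # all-to-all content term
    w = np.exp(logits - logits.max(axis=-1, keepdims=True))
    w = w / w.sum(axis=-1, keepdims=True)                # softmax over keys
    dist = np.linalg.norm(q_pos[:, None, :] - k_pos[None, :, :], axis=-1)
    w = w * truncated_kernel(dist, tau, eps)             # spatial modulation
    w = w / np.maximum(w.sum(axis=-1, keepdims=True), 1e-12)
    return w @ v                                         # (nq, dv)

# toy usage: 4 queries attending over 6 modules embedded in a 2-d metric space
rng = np.random.default_rng(0)
out = distance_modulated_attention(
    rng.normal(size=(4, 16)), rng.normal(size=(6, 16)), rng.normal(size=(6, 8)),
    rng.normal(size=(4, 2)), rng.normal(size=(6, 2)))
```

If every interaction survives the cutoff this reduces to ordinary attention; as eps grows, modules only exchange information with spatial neighbours, which is the inductive bias the reviews debate.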
### Summary: | this paper presents a model for dynamical systems with multiple interacting components each component is modeled as an rnn and the interactions between components are functions of their distance in a learned embedding space its an interesting idea and well motivated inductive bias the results were made more compelling with the addition of ablation studies during the discussion phase which showed how various aspects of the model combined to yield the best performance overall this paper should be of interest to many in the iclr community working on complex multiagent systems | [
input_ids: token-ID array for the tokenized review/summary text, elided ] | [ attention_mask: 1 repeated for the full sequence, elided ] | [ labels: token-ID array, elided (excerpt cut off mid-array)
5886,
281,
253,
2783,
46754,
15254,
4836,
25711,
327,
436,
4836,
841,
3082,
2085,
253,
1682,
5454,
2727,
875,
12171,
18745,
8790,
24926,
285,
625,
4156,
14053,
1580,
1016,
391,
9866,
46259,
281,
247,
2173,
1789,
534,
310,
28668,
417,
6786,
407,
256,
19,
83,
983,
1580,
1016,
391,
9866,
33772,
281,
2714,
907,
327,
247,
2014,
1789,
352,
476,
4354,
320,
11575,
347,
14053,
253,
4156,
1375,
273,
253,
985,
949,
12171,
18745,
8790,
24926,
8534,
327,
7898,
8310,
327,
253,
643,
1133,
352,
310,
2590,
326,
256,
19,
83,
983,
285,
391,
14381,
407,
6880,
3959,
643,
11361,
323,
1650,
271,
5750,
273,
256,
19,
83,
983,
310,
326,
597,
1908,
11911,
1907,
616,
1211,
13461,
534,
476,
7624,
2714,
907,
327,
2173,
6355,
875,
5113,
390,
2740,
1066,
6355,
875,
5113,
2007,
391,
25476,
285,
253,
13052,
403,
3710,
281,
14053,
253,
1072,
873,
273,
6355,
875,
18745,
8790,
24926,
1677,
407,
5113,
3738,
28668,
253,
10732,
273,
271,
1789,
476,
671,
320,
12112,
275,
436,
1083,
26332,
672,
271,
1789,
37382,
6779,
10140,
281,
767,
3520,
5113,
275,
12275,
2317,
50275,
783,
24426,
275,
4677,
608,
5936,
326,
2060,
11911,
2714,
907,
533,
627,
671,
3133,
281,
320,
3240,
690,
39296,
651,
352,
417,
320,
1896,
281,
22048,
253,
6786,
23178,
414,
10380,
24088,
407,
10499,
891,
276,
275,
326,
1083,
352,
651,
320,
4722,
281,
923,
849,
253,
985,
37824,
672,
253,
1180,
273,
2530,
11911,
310,
12497,
281,
1566,
1016,
2060,
12353,
275,
253,
2783,
985,
285,
672,
253,
1180,
273,
11911,
310,
4503,
281,
253,
1180,
273,
14142,
24088,
5113,
327,
46754,
15254,
23000,
513,
368,
452,
667,
30328,
323,
752,
6569,
604,
11911,
403,
5176,
387,
1071,
2606,
50274,
22309,
403,
391,
14381,
417,
2908,
275,
2829,
337,
891,
651,
320,
9861,
604,
352,
17923,
326,
1199,
7197,
281,
256,
19,
1109,
1580,
1097,
3210,
1347,
2074,
327,
253,
46754,
15254,
4836,
604,
436,
310,
253,
1083,
840,
4496,
387,
253,
1077,
1878,
2085,
690,
30328,
390,
12288,
281,
5513,
436,
3064,
50274,
32897,
2486,
247,
5955,
273,
7364,
275,
253,
6452,
50274,
32897,
1379,
247,
2774,
281,
564,
949,
253,
10414,
285,
9113,
26542,
9380,
326,
452,
644,
3863,
50274,
74,
452,
5520,
619,
4868,
1563,
253,
11701,
1160,
407,
253,
4477,
923,
619,
12252,
2708,
323,
4278,
187,
187,
4118,
18435,
27,
2520,
2929,
10262,
247,
1566,
323,
18525,
2718,
342,
2709,
18745,
4295,
1016,
4445,
310,
23115,
347,
271,
391,
9866,
285,
253,
6355,
875,
4295,
403,
3470,
273,
616,
4181,
275,
247,
6311,
21496,
2317,
697,
271,
4722,
2934,
285,
973,
17194,
42115,
8492,
253,
1543,
497,
1160,
625,
18511,
342,
253,
1635,
273,
28913,
2175,
1309,
253,
5955,
3408,
534,
2692,
849,
2710,
7794,
273,
253,
1566,
5678,
281,
4917,
253,
1682,
3045,
50276,
1189,
455,
436,
2929,
943,
320,
273,
1600,
281,
1142,
275,
253,
17857,
32888,
3114,
2444,
327,
2570,
4471,
12788,
2718
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the paper presents a new task openset single domain generalization where only one source domain is available and unknown classes and unseen target domains increase the difficulty of the task a new method crossmatch is proposed to solve this new problem firstly auxiliary examples are generated for unknown classes out of the source classes then multibinary classifiers are used to deal with unknown class identification in domain adaptation then the paper proposes a crossclassifier consistency regularization that minimizes the multibinary classifiers output and onevsall multiclass classifiers output the experiments show the proposed method could largely improve the accuracy for unknown classes in the target domain strength ossdg is an interesting and realistic problem and the paper clearly described its difference from existing methods the proposed method makes sense and is technically sound the experimental results show the effectiveness of the proposed methods weakness one concern is that the author maybe could analyze which parts of their approach decrease the known classs accuracy this could be interesting to know since in some settings like table 3 adding cm decreases the accuracy for known classes by about 3 another thing is that some results in the appendix could move forward to replace the converging figures in figure 4 the paper is wellwritten and is technically sound and the proposed problem makes sense docsepin domain generalization dg label set of target domain is that of source domains however we might meet the unknown classes in the target domain which will cause significantly prediction error on such unknownclass data points in the target domain to avoid this issue this paper formulates a new problem setting openset dg where the label set of target domain contains the label set of source domains this paper actually considers a more challenging problem openset single dg ossdg that extends the problem setting of dg to a more general situation to address this very challenging problem this paper designs a crossmatch approach to improve the performance of sdg methods on identifying unknown classes by leveraging a multibinary classifier crossmatch generates auxiliary samples out of source label space by using an adversarial data augmentation strategy this paper also adopts a consistency regularization on generated auxiliary samples between multibinary classifiers and the model trained by sdg methods to improve the models capability on unknown class identification experimental results on benchmark datasets prove the effectiveness of crossmatch on enhancing the performance of sdg methods in the ossdg setting in general this paper contributes a novel problem setting and a validate solution to this setting although some motivations are unclear the contributions of this paper are enough pros if there are unknown classes in the target domain existing dg methods cannot handle this situation as a result existing methods will cause significantly prediction errors on such unknownclass data points in the target domain making the prediction of a network unreliable in this paper they propose a new problem setting and a new method to handle this challenging problem introducing multibinary classifier to the ossdg problem seems very interesting since it may identify the unknownclass region well this paper is easy to follow experiments can partially support the claims made in this paper a plus should be that the authors also design some baselines to their problem setting which provides solid baselines 
to see if the gains obtained by cm are significant cons the presentation should be improved there are many modules introduced in this paper however they are not wellmotivated it is better to explain their intuitions why they can help improve the performance i am not sure if it is necessary to list the contributions in the introduction such contributions have been described clearly in intro and abs it seems that you do not need to restate them key related works are missing in the literature openset domain adaptation and openset learning are very relevant topics to your proposed problem setting they should be carefully reviewed and discussed with your methodsetting in some openset learning papers they also consider to generate unknownclass data which is also a key module of your method in the theory of dg researchers need to assume the relations between source domains and target domain to ensure that dg can be solved however this paper does try any theoretical analysis to their problem setting which raises my concerns regarding the performance of cm on other datasets it would be great if some theoretical analysis can be concluded or analysed i would not like to make this problem can only be addressed by some heuristic methods how many times do you repeat your experiments i did not see error barstd values of your methods this should be provided to verify that the experimental results are stable one key experiment is missing openness experiments should be done to show the performance of your method when unknown classes change although hs is a new criterion for openset da it is better to include the known acc as well besides i did not see accu in table 3 which should be provided i did not see how the threshold affects the performance of your method the threshold mu is a very important hyperparameter how do you choose mu in the dg problem in general considering the significance of the researched problem this paper might be accepted by the iclr2022 however some points should be clarified and strengthened in the revision post rebuttal the authors have addressed my concerns thus i increase my score from 6 to 8 docsepthis study tackles a novel domain generalization task called openset single domain generalization in which a model is to be trained with single source domain but needs to generalize well at unseen target domains that may include unseen classes to solve this problem the authors extend single domain generalization methods to be able to learn detection of unseen classes by adopting adversarial data augmentation experimental results show that the proposed scheme can facilitates the capability of existing methods on detecting unknownclass data strength openset single domain generalization is a novel challenging but practically important problem setting figure 1 is a good summarization on relationship between this problem setting and related ones experimental results with several benchmark datasets show that the proposed scheme makes it possible to extend existing domain generalization methods to detect unknownclass data in the inference phase weakness i have several concerns regarding the design of crossmatch do we need to assume that all target domains share the same label space in eq 7 since ddu ds rho is the only constraint on du it seems that du can freely go far away from the source data distribution if we take supreme with respect to du although the objective function for a whole training process is defined in eq 7 the loss functions in each stage do not follow this definition eq 8 
adopts an additional hyperparameter alpha eq 9 and 10 adopt totally different loss functions which are lsdg and lunk considering the motivation of crossclassifier consistency regularization it would be better to stop gradient for fb is there any reason to minimize lccr with respect to fb although openset single domain generalization can be seen as sdg openset classification task the authors only use sdg methods as baselines in the experiments naive openset classification or outofdistribution detection methods should perform good for unknown classes but not for known classes due to domain shift which highlights the advantage of the proposed method more clearly i could not judge the significancy of the proposed method from the current experimental results how did the authors tune mu in the experiments and is it also used for baseline methods erm ada and meada the accuracy on unknown classes should heavily depend on the setting of mu several notations are confusing i list some in the following in general domain means a pair of representation and distribution of data r1 not a dataset r1 a survey on transfer learning ieee trans on kde 2009 the arguments of the loss functions sometimes change in the manuscript in eq 7 it should be better to explicitly describe which data distribution is used to take expectation in eq 13 pbi t1x should be equal to pbi minor concerns that do not affect my score in table 4 lunk seems to have more impact on performance than lccr though classifier consistency regularization is only included in the title openset single domain generalization is an interesting and important problem setting however the proposed method is not approapriately designed for this problem setting in addition the experiments and discussion lack an important related work which is openset classification or outofdistribution detection i vote for weak reject docsepthis paper proposes a new method called crossmatch for openset single domain generalization where only one source domain is available to train the model this problem studied here is interesting and sounds reasonable which is also closely related to openset domain adaptation and single domain generalization in particular crossmatch designs a new strategy to generate auxiliary samples for unknown classes and develops a novel consistency regularization to help identify samples from unknown classes in target domains results on several datasets verify that the proposed method achieves impressive results on three widelyused datasets strengths 1 this paper proposes to study a new and interesting problem openset domain generalization 2 experiments show that crossmatch improves previous domain generalization methods like ada neurips 2018 and meada neurips 2020 and obtains impressive results on three object recognition datasets weaknesses 1 novelty of the proposed method sounds somewhat incremental compared with prior works in dg volpi et al 2018 zhao et al 2020a and uda liu et al 2019 saito saenko 2021 2 the intuition behind lccr in eq14 is hard to understand how does matching the outputs of the onevsall classifier and the multiclass classifier benefit the generalization ability the results of the variant lunklunk are missing in table 4 3 also the results of openset domain adaptation are missing in the experiments which are vital and could be considered as the upper bound of this new studied problem setting typos 1 around eq12 lconsis should be lccr this paper proposes to study a new and interesting transfer learning setting called openset domain 
generalization and develops a new method crossmatch for this problem however crossmatch is mainly built on several previous methods eg the multibinary classifier liu et al 2019 saito saenko 2021 adversarial data augmentation volpi et al 2018 zhao et al 2020a making the overall novelty incremental for iclr thus i tend to give a borderline reject score
### Summary: | the paper presents a new problem openset single domain generalization where only one source domain is available and unknown classes and unseen target domains increase the difficulty of the task to tackle this challenging problem this paper designs a crossmatch approach to improve the performance of sdg methods on identifying unknown classes by leveraging a multibinary classifier crossmatch generates auxiliary samples out of source label space by using an adversarial data augmentation strategy then the paper proposes a crossclassifier consistency regularization that minimizes the multibinary classifiers output and onevsall multiclass classifiers output the proposed ossdg is an interesting and realistic problem however since it is way more challenging the optimal solution to it remains elusive some reviewers think the method might be heuristic and lack theoretical guarantees nevertheless the results are promising and the paper makes a first step toward the challenging ossdg problem another concern is that the ccr loss needs more ablation studies to further analyze its role though the authors have added more explanation of this part i suggest the authors put more ablation studies in the final supplementary document overall the paper is novel and interesting i would recommend acceptance of this paper given its novelty and impressive performance but i highly suggest the authors add more ablation studies in the final supplementary as suggested by the reviewers | [
(input_ids, attention_mask, and labels sequences for this row, 2,048 token-ID values each, omitted)
] |
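To make the mechanism described in the review and summary above easier to picture, here is a minimal sketch of a CrossMatch-style training signal: source images are adversarially perturbed so they can stand in for unknown classes, and a consistency term asks the K one-vs-all binary heads and the K-way multiclass head to agree on those auxiliary samples. This is an illustrative reconstruction, not the paper's implementation; the model attributes (backbone, multiclass_head, binary_heads), the PGD-style perturbation, and the KL form of the consistency term are assumptions made for the example.

```python
# Illustrative sketch only, not the CrossMatch reference implementation.
# Assumed (hypothetical) model interface: model.backbone(x) -> features,
# model.multiclass_head(f) -> (B, K) logits over K known classes,
# model.binary_heads(f) -> (B, K) one-vs-all logits, one per known class.
import torch
import torch.nn.functional as F

def make_auxiliary_samples(model, x, y, eps=0.03, step=0.01, iters=5):
    """Perturb labeled source images away from their own class so they can
    stand in for unknown-class data (an assumed PGD-style scheme)."""
    x_adv = x.clone().detach()
    for _ in range(iters):
        x_adv.requires_grad_(True)
        logits = model.multiclass_head(model.backbone(x_adv))
        loss = F.cross_entropy(logits, y)              # ascend this loss
        grad, = torch.autograd.grad(loss, x_adv)
        x_adv = x_adv.detach() + step * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps)  # stay close to the source image
    return x_adv.detach()

def cross_classifier_consistency(model, x_aux):
    """Consistency term that asks the one-vs-all heads and the multiclass head
    to agree on auxiliary (unknown-like) samples."""
    feats = model.backbone(x_aux)
    p_multi = F.softmax(model.multiclass_head(feats), dim=1)        # (B, K)
    p_ova = torch.sigmoid(model.binary_heads(feats))                # (B, K)
    p_ova = p_ova / p_ova.sum(dim=1, keepdim=True).clamp_min(1e-8)  # normalize for comparison
    return F.kl_div(p_ova.clamp_min(1e-8).log(), p_multi, reduction="batchmean")
```

In an actual training loop this term would typically be added with a small weight to the ordinary supervised loss on the clean source batch.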
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the paper presents a new vit architecture by adopting the concepts of pyramid structure and employ regionaltolocal attention instead of global selfattention as in standard vision transformers the regional and local concept is linked to patch size ie region consists of larger patches in comparison to local a regional token is associated a set of nonoverlapping local tokens while computing reginaltolocal attention this attention is computed using a regional selfattention comprising all reginal tokens to extract global information followed by a local self attention mechanism that pass information between the reginal token and the linked local tokens via self attention the approach is evaluated image recognition object and keypoints detection semantic segmentation and action recognition the performance is comparable to the stateoftheart ives the paper is wellwritten and easy to follow the experimental evaluation on various dataset is very impressive however the accuracy improvement is not convincing the accompanied source code ives there is a significant advancement using parthierarchies zheng et al tip 2019 attention pyramid ding et al tip 2021 attentiondriven hierarchical multiscale warton et al arxiv 2021 i am aware that this paper is added recently and patches are all you need this is also under review httpsgithubcomtmpiclrconvmixer it is unclear how the regions and patches are novel to me it is how adapted in transformers it is unclear how memory saving is 73 when you are computing selfattention among regions followed by local selfattention between regions and local patches is dimension of c for features representing local patches and regional patches same the ablation studies is very comprehensive however missing ablation study without selfattention between regions first step of r2l attention it is also unclear on the 2nd step of the which one is used as key for multihead attention is it local patches if so what could be the outcome if we use the region as keys the overall accuracy of the model is good but not significantly better than the sota approaches the accuracy is better for models in which the window size is 14 and the model has higher gflops and parameters the idea of large patches to capture highlevel shape as well as concentrate on detailed texture and parts information using smaller patches is very good however the justification could have been improved the paper is wellwritten and the experimental evaluation is comprehensive however the improvement in accuracy is not significant in comparison to the way the novelty of r2l attention is described there is an advancement to the transformer model however the impact is limited the related work is very nicely done docsepthis paper introduces local inductive bias into vanilla vision transformer by adopting the pyramid structure and employing regionaltolocal attention the regionaltolocal r2l attention is the main contribution of the paper compared to the vanilla selfattention r2l has two steps 1 selfattention on regional tokens 2 selfattention on local tokens and their corresponding regional token advantages 1 this paper is well written and easy to follow 2 the experiments are sufficient to evaluate the proposed method disadvantage since the r2l attention has two steps i am concerned about the throughput of regionvit flops can not directly reflect the throughput it is better to show the throughput comparison for table 3 the idea sounds direct and reasonable although introducing local inductive bias into vit is not and there have 
been lots of papers trying to achieve it. The paper is well written and shows efficient experiments to evaluate the proposed RegionViT.

The paper introduces a new spin on the ViT architecture. Similar to other recent developments, it proposes a pyramidal-style architecture in which subsequent layers process the input image at decreasing resolutions. This allows the approach to be significantly more memory- and compute-efficient than the original ViT architecture while still obtaining solid results on several tasks and benchmarks. At the core of the method is a regional-to-local attention approach: very coarse regional tokens are processed by a normal multi-head attention, yielding global attention, and each coarse regional patch is additionally represented by a set of smaller local patches, which are processed together with the corresponding regional token to get finer-grained local attention. Each layer in the network follows this principle, and between different sections of the network both the regional and local tokens are downsampled with strided conv layers.

Strengths: Overall the paper is fairly easy to follow, and it is easy to understand how the approach works, including a fair amount of detail to potentially reproduce results; the provided code (albeit I did not run it) seems like it should work. In many of the presented experiments, RegionViT does indeed obtain on-par or state-of-the-art results when compared to similarly sized models. Quite some relevant and interesting ablations are presented.

Weaknesses: For me, the main weakness of the paper is that it seems fairly incremental in a longer line of ideas: first there was ViT, followed by more efficient training from DeiT including many augmentations; several works then proposed pyramid-like architectures; and finally this work is a small increment on how to best realize the pyramid architecture, including some smaller tweaks. The scores indeed do improve, but sometimes there are quite some different variants (vanilla, w/ PEG) that are not consistently listed for the different tasks, partially raising the question how large the margin to the previous models really is. While I don't doubt that the approach has some merit, I would find the paper significantly more valuable if the focus of the experiments were to dive deeper into how to best design the pyramid structure of the network, and less on listing many different tasks with some table showing state-of-the-art results. Right now this paper presents yet another new proposal on how to change the ViT architecture, but I gain little insight into why it really performs better; shifting the focus to this part of the paper would be more interesting in my opinion. For example, it would be interesting to discuss some of the ablation results in more detail, or to do a more thorough investigation into the size of the local and global patches; one could even ask whether two levels (local/regional) are optimal, or whether an additional super-regional or global attention could also be useful. I was a bit surprised by the fairly low results from the ImageNet-21k transfer experiments. Of course it is clear that the FLOP count and parameter count are significantly lower than the better models in Table 4a; however, it does raise the question whether RegionViT can indeed be scaled to such an extent that it matches the state of the art. Given that one big part of the method is the reduced complexity, it is also not surprising that the model is computationally less heavy, but it would be a drawback if there is some performance ceiling it hits due to the specific architecture. Here it would be good to create some even bigger model to see if the larger models can be matched or outperformed. Also, the other transfer experiments in Table 4 are not super convincing. The keypoint detection and action recognition sections take up quite some space, and especially Table 6 does not list any convincing baselines; at the same time, these experiments don't add huge value to the overall story in my opinion. I would recommend using this space to perform more relevant and interesting ablations/experiments/discussions instead, and moving these to the supplementary material. Lastly, the writing is not the best in some parts of the paper; for a potential final version I would recommend that the paper is proofread by a native speaker.

Open questions: Table 1 makes some statements about overlapping vs. non-overlapping patches; given the way the local tokenization works (using 1 or 3 convs), I would say it is somewhat of a stretch to say that the patches are strictly non-overlapping. I was fairly surprised about weights being shared between RSA and LSA, as well as in the downsampling convs; I don't really see a larger number of parameters as a huge issue for a model, and the FLOP count / inference speed should be the same if these weights were not shared. Actually ablating this would be very interesting; I would kind of expect that the local and regional tokens contain slightly different things, and having separate weights could potentially improve the results. The paper states that this model saves 73% of memory; I'm wondering if this is an empirical measure or if it was derived based on the complexity analysis, and I guess this factor might also be different for the tisbl models. I find it quite surprising that the regional tokens don't have a huge effect on the overall performance (ablation 1); is the remaining strong performance based on the pyramidal structure of RegionViT? Given that the pure local attention is never performed on overlapping areas, that is the only reason I would see why the model still remotely works.

Overall, the paper proposes a fairly interesting pyramidal modification to ViT. It seems like a somewhat incremental approach, though, and while the performance is empirically shown, there is little that can really be learned about the underlying approach from the experiments. Nothing is really wrong with the paper; I do think it could be significantly better, though. As such, I recommend the paper to be accepted, but I do think it has a lot more potential that is not uncovered at the moment, and I would actually be very happy to read such an improved version of the paper.

This paper proposes a new architecture that adopts the pyramid structure and employs a novel regional-to-local attention for vision applications. The regional-to-local attention reduces the memory complexity compared to standard self-attention, and the performance of this work is promising. However, the novelty of this work is somewhat trivial: the issue of the interaction between global and local information has been raised by previous work, not only in image classification but also in other downstream tasks, and the proposed methods are not novel enough when compared with the previous ones. Several grammatical mistakes are made, e.g. in the abstract "the regional self-attention extract global information among" (extract should be extracts). This work aims to solve the interaction between global and local information in a ViT architecture and to save computation at the same time; the performance is good, but I am concerned with the novelty of this work, and the contribution may be limited.
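To make the two-step regional-to-local (R2L) attention described in the second review above concrete, the following minimal PyTorch sketch may help readers who have not seen the paper. It is not the authors' implementation: the tensor layout, the single attention module shared between the regional and local steps, and the omission of norms, MLPs and residual connections are illustrative assumptions.

```python
import torch
import torch.nn as nn

class RegionalToLocalAttention(nn.Module):
    """Two-step R2L attention: (1) self-attention among regional tokens,
    (2) per-window attention over [regional token ; its local tokens]."""

    def __init__(self, dim, num_heads=4):
        super().__init__()
        # One attention module reused for both steps, mirroring the RSA/LSA
        # weight sharing a reviewer asks about; a full block would add
        # norms, MLPs and residual connections around each step.
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, regional, local):
        # regional: (B, R, C)   one coarse token per window
        # local:    (B, R, W, C) W fine tokens inside each window
        B, R, C = regional.shape
        W = local.shape[2]
        # Step 1: global information exchange among the R regional tokens only.
        regional, _ = self.attn(regional, regional, regional)
        # Step 2: local attention inside each window, with the window's regional
        # token prepended so global context reaches the local tokens.
        tokens = torch.cat([regional.unsqueeze(2), local], dim=2)  # (B, R, 1+W, C)
        tokens = tokens.reshape(B * R, 1 + W, C)
        tokens, _ = self.attn(tokens, tokens, tokens)
        tokens = tokens.reshape(B, R, 1 + W, C)
        return tokens[:, :, 0], tokens[:, :, 1:]

# Toy usage: a 4x4 grid of windows, 7x7 local tokens per window, embedding dim 64.
blk = RegionalToLocalAttention(dim=64)
reg, loc = torch.randn(2, 16, 64), torch.randn(2, 16, 49, 64)
reg_out, loc_out = blk(reg, loc)
print(reg_out.shape, loc_out.shape)  # (2, 16, 64) and (2, 16, 49, 64)
```

In this sketch the per-layer attention cost is O(R^2) for the regional step plus O(R * (1+W)^2) for the local step, rather than O((R*W)^2) for full self-attention over all patches, which is the memory and compute argument made in the reviews.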
### Summary: | The paper proposed a new architecture called regional-to-local attention for vision transformers. The idea is easy to understand: the model adopts the pyramid structure and adds a regional-to-local attention instead of using global attention. The architecture is well motivated and the paper is generally well written. The main concerns from the reviewers are mostly clarification questions, and the authors did a good job addressing them. Apart from those, most reviewers raise the novelty issue of such an architecture, which I would think is a drawback of this paper. I am leaning towards the acceptance of this paper mainly because of its experimental results; it is the best in my batch, and I think there is a significant improvement over the previous approaches. | [
... 2,048 input_ids token IDs omitted ... ] | [ ... 2,048 attention_mask values, all 1, omitted ... ] | [ ... 2,048 labels token IDs omitted (they appear to duplicate input_ids) ... ] |
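For readers wondering what the three omitted numeric columns are: each row stores the prompt-plus-summary text twice, once as readable strings and once as fixed-length token-ID arrays (here 2,048 entries each, with an all-ones attention mask and a labels column that appears to duplicate input_ids). A row like this is typically produced by a single tokenizer call along the following lines; the specific tokenizer checkpoint and the decision to copy input_ids into labels are assumptions, not something stated in this dump.

```python
from transformers import AutoTokenizer

# The actual tokenizer used for this dataset is not stated; "gpt2" is only a stand-in.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default

def build_row(prompt_text, summary_text, max_length=2048):
    # Tokenize prompt and target together, truncating/padding to a fixed length.
    enc = tokenizer(prompt_text + " " + summary_text,
                    max_length=max_length, truncation=True, padding="max_length")
    return {
        "Input": prompt_text,
        "Output": summary_text,
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],
        "labels": list(enc["input_ids"]),  # duplicated, as in the rows above
    }
```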
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
The paper designed and tested WT-AWP, a new adversarial weight perturbation approach for graph neural networks. They demonstrated that by locating flat local minima WT-AWP can improve the regularization of GNNs, and they carried out comprehensive tests to verify the method: WT-AWP reliably increases GNN performance on a wide range of graph learning tasks, including node classification, graph defense and graph classification.

This paper focuses on extending AWP to GNNs. It analyzes the vanishing-gradient issue existing in AWP and gives a more detailed theoretical proof. The experimental part of this article is comprehensive, and the experimental results also verify the advantages of the proposed method. However, there still exist some insufficiencies and confusing places:
1. What is the term h in equation 3? The statements say h is a monotonically increasing function; where is it from?
2. The paper uses comprehensive experiments to show the efficiency when facing attacks, but gives no analysis or theoretical proof for this advantage. I think the source of the advantage in handling attacks is worth clarification, especially since in Table 2 some experimental results under attacks are even better than in the natural setting.
3. Figures 1 and 3 are blurred; I cannot distinguish face color and border color. It is recommended to use vector graphics.
4. The sentence in the footnote on page 5, "perturbing only the second layer instead performs similarly", is confusing to me.
The paper is well organized and has comprehensive experiments to defend its method; however, some statements and equations are not clear enough, and the illustrations and figures are not meticulous. I hope the authors can further polish the paper in detail.

The authors propose a variant of adversarial weight perturbations / sharpness-aware minimization for graph convolutional neural networks for node and graph classification. In particular, they make two adjustments: truncating, i.e. limiting the weight perturbation to specific layers, and weighting the sharpness-aware loss with the regular loss during training.

Strengths: The paper is generally clearly structured and written; the toy-example visualizations are nice, and the algorithm helps understanding. An interesting problem/question is addressed: how to effectively use AWP / sharpness-aware minimization for graph neural networks. It is also nice to show that the optimum does not change with AWP training. Experiments for clean and robust accuracy with several models are conducted; besides, mean and standard deviation are reported, which is nice to judge the improvement. The approach improves across the board, even though the improvement is small at times. Analysis in terms of flatness, gradients and visualization, and ablation studies are provided.

Weaknesses: In the introduction, the authors state the i.i.d. assumption as an important criterion for existing flatness approaches; however, it is not discussed when this becomes relevant. As far as I am aware, the i.i.d. assumption does not play an important role in related discussion/papers; could the authors comment on that? As I understand the paper, the main point is to see how these approaches work in non-i.i.d. settings such as node classification, but the paper does not really theoretically discuss this issue. One paper that could be included in related work, as it is quite related: (a) https://arxiv.org/abs/1609.04836. Also, a discussion of scale-invariance and the criticism of (b) is missing; this should be handled as in Stutz et al., but it would be interesting at least to say why this generalizes to the GCNs considered. (b) https://arxiv.org/pdf/1703.04933.pdf. Notation-wise, the main sections could be improved by making explicit that a per-layer neighborhood B is used from the beginning; currently this is made clear in Eq. 4, while Eqs. 1 to 3 suggest that all parameters are appended. Also, are the GCN baselines equipped with biases in addition to the weights? If so, are biases and weights treated as separate layers, as in Stutz et al.? In terms of contributions, the theoretical contributions (mainly Thm. 1) and methodological contributions seem a bit limited. Although I have not seen Thm. 1 in other papers, which makes it refreshing to actually see it, I found that the result is very intuitive and the proof is also quite straightforward. The techniques employed to improve AWP also seem very specific to graph problems, where larger rho are used than for vision problems; thus I see the contributions mostly in verifying this approach on graph data. Regarding the vanishing gradient problem, I am having difficulties understanding why rho has to be chosen as large as done throughout the toy example and the experiments; Stutz et al. and Wu et al. consider very small rho of 0.5, 0.005 or lower. Obviously these are much deeper networks and not graph neural networks; I am wondering if the authors could give more details on why a large rho is needed. Is it because of the 2-layer structure or the architecture differences? The two proposed approaches do actually not improve performance on the toy example; while I understand that it is meant for illustration purposes, I believe it is badly chosen. I see that without truncation and weighting the accuracy is very bad, but shouldn't you show that you can improve over the baseline of 98% and not be stuck at 95%? Did you try optimizing hyperparameters, or is it a problem where TW-AWP just does not help, which would be interesting? Regarding 5.2, I am not entirely convinced that the gradient norm is the best indicator of flatness, exactly because of the scale-invariance argumentation of Stutz et al.; can the authors comment on that? I guess that the used GCNs do not use batch normalization (I have not seen BN used for graph neural networks before), but scale-invariance is a problem as described in (b). Also, Fig. 4 (c) and (d) have some outliers that are not really explained, and Table 2 is hard to read and very small. Some ablation that I find missing: an ablation regarding layers (which layer to skip). Obviously the networks are two-layer networks, but I would find it very interesting whether it is always the last layer to skip, also in deeper networks, or always the first layer to perturb, also in deeper networks.

I appreciate this paper in terms of applying AWP to graph neural networks and showing how it needs to be adapted to work well; methodological contributions are small, however, and improvements vary across datasets and cases.

The authors extend the line of work related to adversarial weight perturbations, first showing that a vanishing gradient issue in standard AWP can hinder training; to remedy this, some natural tweaks are applied to AWP. The authors then focus on using AWP to train graph neural networks, demonstrating a minor boost in both clean and robust accuracy.

Strengths: The line of work concerning adversarial weight perturbations is interesting and significant, as it is one of a few locations where the nascent theory of deep networks can provide easy tweaks to training that improve generalization. Identifying a problem with the current approach, even on MLPs, is valuable and significant, and ample evidence is presented to suggest the vanishing gradient problem is actually what is occurring. The proposed solutions are simple and easy to implement, and the experimentation is thorough. While the improvements provided by AWP are very minor (less than a percent in some instances), it is yet another easy and cheap trick to ever-so-slightly boost the performance of a network.

Weaknesses: First, while it is an important insight to notice the vanishing gradient, the proposed remedies (weight truncation and weighted AWP) are natural and not particularly novel; these are likely the first things that one would try to mitigate the observed problems with AWP, and they don't represent a great stride forward in the field. Similarly, it is not particularly surprising that the benefits of AWP, as evidenced in MLPs, carry over to GNNs; not enough motivation or evidence is presented to make this seem surprising or unexpected. Third, Theorem 1 is trivial: if the definition of L_train is to be the one that only performs a single first-order step, then of course the gradient evaluated at a minimum is going to be zero. The interesting question here is how the true AWP loss (Eq. 2) relates to the standard training loss; this theorem doesn't add anything to the story.

Questions: One of my complaints is about the novelty of the contribution regarding GNNs vs. MLPs; did I miss this, or is there simply a much more pronounced effect of the gradient-vanishing phenomenon in GNNs than in MLPs? How much do the gradient norms with respect to the weights actually change when training under AWP vs. WT-AWP vs. standard training?

This is an interesting line of work, and the pointing out of, and subsequent fixing of, the gradient-vanishing phenomenon in AWP is a valuable contribution. Past this, none of the results are strikingly novel or groundbreaking. I think this is a borderline paper tending towards rejection, but I could be convinced to boost my score slightly if I've misunderstood something.

This work studies how AWP improves GNNs, mainly at the generalization aspect. The authors first derive, in Section 3, several theoretical results that can be easily derived from the existing works on CNNs (e.g. Wu et al. 2020a and Foret et al. 2021). In Section 4, the authors show that if they directly apply a large rho, as suggested by Section 3, they will encounter the gradient-vanishing phenomenon during training. Based on this phenomenon, the authors propose two ways to solve this: one is truncated AWP, the other is weighted AWP. Then the authors conduct simple experiments using vanilla GCN, GCN-AWP, GCN-TAWP and GCN-WAWP, where vanilla performs the best, whereas GCN-TAWP and GCN-WAWP have a smoother decision boundary, and they decide to combine T and W with AWP; to be frank, I do not know why they do not add the experimental result of WT-AWP here. In Section 5, the authors conduct a lot of experiments (clean accuracy, robust accuracy, etc.) which follow the existing works. The contribution, as I can see, is that the authors are the first ones to conduct extensive experiments using AWP on graph datasets; however, I do not see other contributions listed by the authors, which I will explain later in the main review.

Strengths: I am satisfied with the numerical experiments in Section 5, although they largely follow the routes of existing work.

Weaknesses (by sections):
Section 3: (i) The theoretical results are very incremental, based on existing works (e.g. Wu et al. 2020a), from which they can be easily derived or directly used. (ii) The AWP algorithm listed here is the same as in Wu et al. 2020a, only adding a letter "a".
Section 4: (i) After I read the paper thoroughly twice, I do not see how you assign theta_awp and theta_normal, which is very important. If you have that in your paper, first let me know where I can find it; second, move it to Section 4 and explain it; it is unacceptable not having it in Section 4. (ii) You do not have theory or even intuition about T-AWP and W-AWP. In my perspective, how you justify your algorithms is very important, much more important than the so-called theoretical results in Section 3; you could even save the space in Section 3 to defend your algorithms in Section 4. I believe what I point out here should be your novelty for this paper; at least please add some good intuitions for your algorithms. I will add points if you do that. (iii) Please add the WT-AWP result in Figure 3.

Summary: good experiments, but weak in that it is (i) too incremental and lacking in novelty, and (ii) does not provide a theoretical/intuitive explanation of the proposed algorithms. This work (i) is quite incremental compared to the existing works and (ii) does not explain/justify/describe the algorithms well. I do not think this paper is good enough for ICLR.
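The four reviews above all refer to the same training recipe: build an adversarial weight perturbation from a first-order step of the loss gradient, restrict that perturbation to a subset of layers (truncation), and mix the perturbed loss with the ordinary loss (weighting). For readers unfamiliar with AWP, the sketch below shows one plausible reading of that recipe; it is not the authors' code, and the perturbation radius rho, the loss weight lam, and the layer-name matching used for truncation are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def weighted_truncated_awp_step(model, loss_fn, x, y, opt, rho=0.1, lam=0.5, skip=("2.",)):
    """One illustrative training step: truncated, weighted adversarial weight perturbation."""
    # (a) Gradient of the plain loss; a single first-order step of it defines
    #     the adversarial weight perturbation (cf. the reviewers' reading of Thm. 1).
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    # (b) Truncation: perturb every parameter except layers whose name matches `skip`.
    perturb = {}
    with torch.no_grad():
        for name, p in model.named_parameters():
            if p.grad is None or any(s in name for s in skip):
                continue
            delta = rho * p.norm() * p.grad / (p.grad.norm() + 1e-12)
            p.add_(delta)
            perturb[name] = delta
    # (c) Weighted objective lam * L(theta + delta) + (1 - lam) * L(theta):
    #     backprop the perturbed term first, while the weights are still perturbed.
    opt.zero_grad()
    (lam * loss_fn(model(x), y)).backward()
    with torch.no_grad():  # undo the perturbation before the clean-loss term
        for name, p in model.named_parameters():
            if name in perturb:
                p.sub_(perturb[name])
    ((1.0 - lam) * loss_fn(model(x), y)).backward()  # gradients accumulate in .grad
    opt.step()

# Toy usage: a plain 2-layer MLP stands in for a 2-layer GNN; "2." skips the last layer.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 3))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(8, 16), torch.randint(0, 3, (8,))
weighted_truncated_awp_step(model, F.cross_entropy, x, y, opt)
```

One plausible reading of why the weighting helps, consistent with the reviews: with lam=1 this reduces to plain AWP, whereas with lam<1 the clean-loss term keeps supplying a non-vanishing gradient even when the perturbed term saturates under a large rho.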
### Summary: | This paper considers a variant of adversarial weight perturbations / sharpness-aware minimization for graph convolutional neural networks for node and graph classification. In particular, they make two adjustments: truncating, i.e. limiting the weight perturbation to specific layers, and weighting the sharpness-aware loss with the regular loss during training. The reviewers found the theoretical justifications (the characterization of the vanishing gradient, and the understanding of the non-i.i.d. setting, which was added during the rebuttal) interesting, but several reviewers also found the solution/empirical results not convincing enough. I recommend the authors either shift the focus to the theoretical results or strengthen the empirical results and their connections with theory, following the comments of the reviewers. | [
... 2,048 input_ids token IDs omitted ... ] | [ ... 2,048 attention_mask values, all 1, omitted ... ] | [ ... labels token IDs omitted; the sequence is cut off at this point in the excerpt ...
3524,
253,
4477,
476,
2007,
40167,
253,
2929,
275,
2508,
50276,
7152,
339,
431,
248,
4477,
12661,
247,
12955,
273,
48960,
2801,
26309,
50276,
26440,
1255,
6600,
41458,
323,
4216,
27311,
267,
11454,
6928,
323,
4666,
285,
4216,
9162,
275,
1798,
597,
1056,
767,
23927,
17701,
839,
26332,
14155,
253,
2801,
20452,
281,
2173,
8090,
285,
42428,
253,
9479,
1255,
6600,
2957,
342,
253,
3963,
2957,
1309,
3733,
20544,
50276,
783,
2929,
310,
3839,
4518,
18872,
285,
3542,
253,
20953,
1650,
5304,
5904,
403,
5322,
285,
253,
5933,
7729,
4685,
50276,
266,
4722,
1895,
19751,
310,
9713,
849,
281,
8069,
897,
3768,
793,
73,
5916,
1255,
13823,
41458,
323,
4216,
11454,
6928,
50276,
262,
310,
671,
5322,
281,
921,
326,
253,
24571,
1057,
417,
1818,
342,
3768,
81,
3733,
50276,
16217,
3825,
323,
4076,
285,
10237,
7200,
342,
2067,
3210,
403,
5196,
16280,
1599,
2629,
11254,
310,
2361,
534,
310,
5322,
281,
5963,
253,
7756,
50276,
783,
2746,
19132,
2439,
253,
4450,
1014,
2167,
7756,
310,
1355,
387,
2069,
50276,
12792,
275,
2426,
273,
6507,
1255,
27935,
285,
24426,
285,
28913,
2175,
403,
2530,
50276,
20881,
1255,
265,
50276,
249,
253,
10199,
253,
4477,
1375,
253,
891,
301,
9376,
347,
271,
1774,
6866,
323,
5368,
6507,
1255,
7274,
2299,
352,
310,
417,
5469,
672,
436,
4916,
4623,
347,
2080,
347,
891,
717,
6600,
253,
21255,
301,
9376,
1057,
417,
1132,
271,
1774,
2554,
275,
2905,
5955,
50004,
812,
253,
4477,
4385,
327,
326,
347,
891,
2096,
253,
2929,
253,
2022,
1127,
310,
281,
923,
849,
841,
7274,
789,
275,
1327,
891,
301,
7533,
824,
347,
4666,
9162,
533,
253,
2929,
1057,
417,
1663,
28055,
2319,
436,
2523,
50276,
531,
2929,
326,
812,
320,
2908,
275,
2905,
789,
347,
352,
310,
3240,
2905,
50276,
66,
5987,
39962,
2061,
5375,
1036,
2693,
27244,
1812,
50275,
12563,
247,
5955,
273,
4311,
7821,
14417,
285,
253,
14226,
273,
270,
310,
5816,
436,
943,
320,
15726,
347,
275,
331,
25374,
1162,
355,
533,
352,
651,
320,
4722,
387,
1878,
281,
1333,
2139,
436,
2087,
4219,
281,
253,
305,
68,
2224,
2783,
50276,
67,
5987,
39962,
2061,
9275,
15046,
1229,
2537,
1610,
9275,
50274,
25604,
3020,
253,
2022,
7118,
812,
320,
5520,
407,
2403,
6843,
326,
247,
591,
12026,
9168,
270,
310,
908,
432,
253,
5068,
4390,
436,
310,
1160,
2590,
275,
16186,
577,
1223,
16186,
337,
281,
495,
1804,
326,
512,
3602,
403,
42873,
50276,
12563,
403,
253,
305,
14340,
1666,
25379,
13496,
342,
31306,
275,
1635,
281,
253,
13461,
604,
594,
403,
31306,
285,
13461,
4127,
347,
4858,
8090,
347,
275,
331,
25374,
1162,
355,
50276,
249,
2426,
273,
9021,
253,
10527,
9021,
7194,
289,
78,
337,
285,
35961,
9021,
1646,
247,
2372,
3710,
3738,
891,
452,
417,
2326,
289,
78,
337,
275,
643,
9380,
534,
2789,
352,
31255,
281,
2686,
923,
352,
891,
1119,
326,
253,
906,
310,
1077,
27350,
285,
253,
4737,
310,
671,
3240,
15246,
253,
5609,
7091,
281,
3157,
3768,
81,
671,
1646,
1077,
2173,
281,
4216,
3237,
835,
4067,
391,
1689,
403,
908,
685,
323,
8113,
3237,
3021,
891,
923,
253,
9021,
6571,
275,
49160,
436,
2746,
327,
4216,
941,
50276,
1747,
13218,
253,
29199,
11786,
1895,
891,
717,
1907,
12748,
4685,
2139,
391,
1689,
556,
281,
320,
6777,
347,
1781,
347,
2218,
4768,
253,
20953,
1650,
285,
253,
4679,
331,
25374,
1162,
355,
285,
259,
86,
1162,
355,
1908,
1077,
1355,
391,
1689,
273,
16987,
209,
13930,
390,
2406,
9090,
841,
403,
1199,
12861,
6928,
285,
417,
4216,
11454,
6928,
891,
717,
12371,
604,
253,
4477,
812,
1918,
625,
4278,
327,
2139,
247,
1781,
391,
1689,
310,
3058,
310,
352,
984,
273,
253,
374,
12026,
2605,
390,
253,
10336,
3910,
50276,
783,
767,
4081,
7274,
513,
2686,
417,
3157,
3045,
327,
253,
20953,
1650,
1223,
891,
2096,
326,
352,
310,
5486,
323,
23356,
6378,
891,
2868,
352,
310,
16426,
6777,
891,
923,
326,
1293,
47024,
285,
42428,
253,
7200,
310,
1077,
3076,
533,
943,
2649,
368,
921,
326,
368,
476,
3157,
689,
253,
8245,
273,
10508,
285,
417,
320,
10960,
387,
5325,
858,
368,
1611,
39793,
4373,
22041,
390,
310,
352,
247,
1895,
835,
2500,
1403,
81,
816,
1057,
417,
1361,
534,
651,
320,
4722,
50276,
1747,
13218,
8073,
891,
717,
417,
7094,
13762,
326,
253,
11786,
5222,
310,
253,
1682,
15301,
273,
6507,
1255,
4555,
984,
273,
253,
4311,
7821,
14417,
4154,
318,
273,
331,
25374,
1162,
355,
476,
253,
4477,
4385,
327,
326,
891,
5476,
326,
253,
908,
305,
68,
2224,
513,
417,
897,
14604,
21539,
891,
452,
417,
2326,
270,
79,
908,
323,
4216,
11454,
6928,
1078,
533,
4311,
7821,
14417,
310,
247,
1895,
347,
2529,
275,
270,
671,
3036,
577,
260,
285,
277,
556,
690,
42559,
326,
403,
417,
1663,
5544,
50276,
2420,
374,
310,
1892,
281,
1239,
285,
1077,
1355,
50276,
8826,
28913,
326,
891,
1089,
5816,
28913,
5001,
8090,
534,
3828,
281,
17049,
9090,
253,
6928,
403,
2500,
311,
4071,
6928,
533,
891,
651,
1089,
352,
1077,
4722,
1880,
352,
310,
1900,
253,
1390,
3828,
281,
17049,
671,
275,
12861,
6928,
390,
1900,
253,
806,
3828,
281,
12230,
671,
275,
12861,
6928,
891,
11435,
436,
2929,
275,
2426,
273,
9433,
3768,
81,
281,
4216,
11454,
6928,
285,
4645,
849,
352,
3198,
281,
320,
12956,
281,
789,
973,
35961,
9021,
403,
1355,
2299,
285,
11701,
6889,
2439,
15302,
285,
2219,
50276,
7152,
339,
431,
248,
4477,
9017,
253,
1386,
273,
789,
2905,
281,
48960,
2801,
26309,
806,
4645,
326,
247,
29199,
11786,
2523,
275,
2629,
3768,
81,
476,
35007,
3733,
281,
16748,
436,
690,
3626,
13660,
8765,
403,
3732,
281,
3768,
81,
253,
4477,
840,
2770,
327,
970,
3768,
81,
281,
6194,
4216,
11454,
6928,
17227,
247,
5884,
9510,
275,
1097,
4076,
285,
10237,
7200,
50275,
296,
3755,
20556,
253,
1386,
273,
789,
8664,
48960,
2801,
26309,
310,
4722,
285,
1534,
347,
352,
310,
581,
273,
247,
1643,
8593,
835,
253,
13332,
1154,
3762,
273,
3676,
6928,
476,
2085,
3477,
13660,
8765,
281,
3733,
326,
3157,
26647,
12488,
247,
1895,
342,
253,
1655,
2746,
1014,
327,
13361,
793,
310,
9865,
285,
1534,
285,
24904,
1941,
310,
3559,
281,
1804,
253,
29199,
11786,
1895,
310,
2686,
752,
310,
12952,
253,
4081,
5482,
403,
2969,
285,
3477,
281,
3359,
253,
40290,
310,
11080,
1223,
253,
11701,
2530,
407,
3768,
81,
403,
1077,
5884,
1679,
685,
247,
2558,
275,
690,
10872,
352,
310,
2568,
1529,
3477,
285,
11142,
10480,
281,
299,
735,
375,
46711,
9510,
253,
3045,
273,
247,
2990,
50275,
20881,
1255,
265,
806,
1223,
352,
310,
271,
1774,
12288,
281,
4366,
253,
29199,
11786,
253,
4081,
24371,
2801,
47024,
285,
17375,
3768,
81,
403,
3626,
285,
417,
3782,
4460,
841,
403,
2779,
253,
806,
1841,
326,
581,
651,
1611,
281,
29966,
253,
2540,
3237,
342,
3768,
81,
285,
13414,
1957,
247,
1270,
31482,
3579,
275,
253,
1673,
12014,
352,
310,
417,
3782,
10084,
326,
253,
5373,
273,
3768,
81,
347,
27007,
275,
13361,
793,
4459,
689,
281,
18976,
2224,
417,
2217,
16038,
390,
1941,
310,
3559,
281,
1056,
436,
1646,
10084,
390,
12439,
2626,
10012,
337,
310,
14916,
604,
253,
5426,
273,
298,
24382,
310,
281,
320,
253,
581,
326,
760,
17923,
247,
2014,
806,
2621,
3213,
840,
273,
2282,
253,
11786,
6760,
387,
247,
5927,
310,
1469,
281,
320,
5058,
253,
4722,
1953,
1060,
310,
849,
253,
2032,
3768,
81,
2957,
16186,
19,
7033,
281,
253,
2629,
3733,
2957,
436,
10012,
36908,
823,
823,
2712,
281,
253,
2926,
50275,
34974,
50276,
531,
273,
619,
14672,
310,
670,
253,
38135,
273,
7680,
294,
18976,
2224,
4632,
13361,
793,
858,
891,
2985,
436,
390,
310,
627,
3365,
247,
1199,
625,
17088,
1055,
273,
253,
11786,
6148,
3647,
11562,
275,
18976,
2224,
685,
13361,
793,
50275,
5430,
1199,
513,
253,
11786,
22429,
342,
1675,
281,
253,
13461,
2686,
1818,
672,
3733,
762,
3768,
81,
4632,
259,
893,
16471,
4632,
2629,
3733,
436,
310,
271,
4722,
1386,
273,
789,
285,
253,
13458,
562,
273,
285,
6774,
18505,
273,
253,
11786,
6148,
3647,
11562,
275,
3768,
81,
310,
247,
9865,
7680,
2469,
436,
5293,
273,
253,
1543,
403,
13631,
314,
4460,
390,
3216,
22071,
891,
1158,
436,
310,
247,
45210,
2929,
43981,
4404,
18235,
533,
812,
320,
13762,
281,
9510,
619,
4868,
5777,
604,
209,
422,
46485,
1633,
50276,
7152,
33032,
2520,
789,
2175,
849,
3768,
81,
19132,
253,
305,
9866,
7194,
387,
253,
26647,
4809,
253,
4477,
806,
15313,
2067,
10527,
1543,
326,
476,
320,
4354,
6012,
432,
253,
5368,
2987,
327,
260,
9866,
24088,
259,
86,
1162,
355,
9169,
66,
285,
2273,
85,
1162,
247,
43425,
275,
2593,
495,
275,
2593,
577,
253,
4477,
921,
326,
604,
597,
3587,
4647,
1781,
391,
1689,
347,
5125,
407,
495,
597,
588,
13329,
253,
11786,
29199,
11562,
1309,
253,
3733,
1754,
327,
436,
11562,
253,
4477,
12661,
767,
4088,
281,
8415,
436,
581,
310,
28069,
3768,
81,
253,
643,
310,
17375,
3768,
81,
840,
253,
4477,
2589,
2969,
4679,
970,
26724,
305,
14340,
305,
14340,
1403,
81,
305,
14340,
893,
16471,
305,
14340,
88,
1403,
81,
835,
26724,
1347,
253,
1682,
5727,
305,
14340,
893,
16471,
305,
14340,
88,
1403,
81,
452,
39797,
977,
3061,
7548,
285,
597,
7617,
281,
13398,
2500,
342,
3768,
81,
281,
320,
21332,
891,
513,
417,
871,
2139,
597,
513,
417,
823,
3368,
906,
273,
259,
893,
16471,
1060,
275,
2593,
608,
253,
4477,
2589,
247,
2257,
273,
4679,
751,
4076,
7200,
10237,
7200,
3966,
534,
956,
253,
5368,
2987,
50276,
783,
7680,
347,
891,
476,
923,
310,
326,
253,
4477,
403,
253,
806,
4394,
281,
2589,
9470,
4679,
970,
3768,
81,
327,
4216,
10895,
2299,
891,
513,
417,
923,
643,
9021,
7117,
407,
253,
4477,
534,
891,
588,
5513,
1996,
275,
253,
2022,
2278,
20544,
891,
717,
10048,
342,
253,
10704,
4679,
275,
2593,
608,
3738,
597,
8127,
3637,
253,
15050,
273,
5368,
789,
50275,
20881,
1255,
265,
891,
588,
1127,
253,
32213,
407,
7118,
50276,
4674,
495,
891,
253,
10527,
1543,
403,
1077,
32809,
1754,
327,
5368,
2987,
24088,
259,
86,
1162,
355,
9169,
66,
534,
476,
320,
4354,
6012,
390,
3587,
908,
21255,
253,
3768,
81,
5933,
7117,
1060,
310,
253,
1072,
347,
259,
86,
1162,
355,
9169,
66,
760,
823,
247,
4857,
247,
50276,
4674,
577,
891,
846,
891,
1239,
253,
2929,
16575,
7019,
891,
513,
417,
923,
849,
368,
9212,
39116,
1403,
81,
285,
253,
12505,
1939,
534,
310,
1077,
1774,
604,
368,
452,
326,
275,
634,
2929,
806,
1339,
479,
871,
835,
891,
476,
1089,
352,
1273,
2118,
326,
281,
2593,
577,
285,
5513,
352,
352,
310,
28536,
417,
1907,
352,
275,
2593,
577,
21255,
513,
417,
452,
3762,
390,
1014,
30328,
670,
246,
1403,
81,
285,
259,
1403,
81,
275,
619,
8668,
849,
368,
15249,
634,
11333,
310,
1077,
1774,
534,
310,
1199,
1774,
685,
253,
594,
1925,
10527,
1543,
275,
2593,
495,
368,
476,
1014,
5321,
253,
2317,
275,
2593,
495,
323,
368,
281,
2342,
634,
11333,
275,
2593,
577,
891,
2868,
752,
891,
1127,
562,
1060,
943,
320,
634,
38135,
323,
436,
2929,
387,
1878,
4496,
823,
690,
1175,
16875,
4431,
323,
634,
11333,
891,
588,
823,
2792,
604,
368,
513,
326,
37685,
4496,
823,
253,
259,
893,
46506,
906,
275,
4677,
495,
50276,
8774,
1175,
4679,
5075,
891,
1512,
32809,
285,
3480,
273,
38135,
21255,
513,
417,
3400,
10527,
565,
48714,
8813,
273,
253,
4081,
11333,
50271,
2520,
310,
789,
891,
310,
3240,
32809,
2429,
281,
253,
5368,
2987,
21255,
1057,
417,
5513,
6309,
1419,
49027,
253,
11333,
973,
891,
513,
417,
1158,
436,
2929,
310,
1175,
2217,
323,
17857,
32888,
2490,
187,
4118,
18435,
27,
2520,
2929,
19401,
247,
12955,
273,
48960,
2801,
26309,
50276,
26440,
1255,
6600,
41458,
323,
4216,
27311,
267,
11454,
6928,
323,
4666,
285,
4216,
9162,
275,
1798,
597,
1056,
767,
23927,
17701,
839,
26332,
14155,
253,
2801,
20452,
281,
2173,
8090,
285,
42428,
253,
9479,
1255,
6600,
2957,
342,
253,
3963,
2957,
1309,
3733,
253,
30628,
1119,
326,
253,
10527,
816,
6787,
14846,
273,
29199,
11786,
285,
4685,
273,
1327,
74,
301,
4758,
534,
369,
2879,
1309,
30080,
22559,
403,
4722,
533,
2067,
30628,
671,
1119,
253,
2900,
358,
5378,
474,
1543,
417,
21414,
2217,
891,
5583,
253,
4477,
281,
2057,
5333,
253,
2770,
281,
253,
10527,
1543,
390,
281,
17084,
253,
16774,
1543,
285,
616,
10291,
342,
3762,
1563,
253,
5701,
273,
253,
30628
] |
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
The authors study the POMDP problem from the causal perspective and propose to combine offline and online data to infer the transition model via deconfounding. On the theoretical side, they show that the proposed method is correct and efficient in terms of generalization guarantees. On the experimental side, they evaluate the proposed method on three synthetic toy problems.

My main concerns are on the experimental side. The results on the three very low-dimensional synthetic toy problems are quite limited, and it is hard to judge the validity of the proposed method: as we know, RL problems become exponentially more difficult as the state dimension increases. Also, in the current RL community most RL algorithms take image pixels as input; without such experiments it is unclear how the proposed method would work in real-world scenarios. Comparing with several baselines and SOTA methods is an important way to demonstrate the superiority of the proposed approach; unfortunately, such a comparison is missing in the paper. I suggest the authors add some, which would make the paper more convincing, e.g., Rezende et al. 2020, Kallus et al. 2018, Zhang & Bareinboim 2020, etc. The experimental results are quite limited, so they are not enough to support the claims in the paper.

docsep
This paper considers the model-based reinforcement learning (RL) problem of combining offline and online data. The online data (i.e., the interventional data) is generated from the standard partially observable Markov decision process (POMDP), while the offline data (i.e., the observational data) is generated from the privileged POMDP, where the offline learner had access to the state information (i.e., the unobserved confounder) when making an action. The authors propose an augmented learning procedure to safely combine these two separate data sources and learn a more efficient policy. Their method is shown, both theoretically and empirically, to be better than not using the offline data. My main concerns lie in their framework and assumptions, their novelty compared with the existing literature, and the comparison studies.

Strengths:
1. This paper addresses an interesting and important question in RL, i.e., how to use observational data to improve the performance of online learning.
2. The authors claim that their setting is nontrivial because it considers unobserved confounders in the observational data.
3. Their method was shown to be valid and promising, both theoretically and empirically.

Weaknesses:
1. Since one major contribution claimed in this paper is to bridge causal inference with reinforcement learning, I was expecting the authors to use a more rigorous causal framework and the necessary assumptions to ensure the validity of their method and theory. For instance, to replace the do-operator with conditional probabilities, one should assume ignorability or exogeneity; please refer to [1] below and add the related assumptions.
[1] Pearl, Judea. Models, Reasoning and Inference. Cambridge, UK: Cambridge University Press 19, 2000.
2. I am not very convinced why the online data in particular follows a POMDP while the offline data follows a (possibly privileged) POMDP. Is a pre-testing procedure required to justify the model assumptions?
3. There are at least two directions of literature that the authors should pay attention to in order to justify their novelty.
(a) First, a number of works have proposed to combine observational and experimental data, though not for RL, such as [2], [3], etc. The authors may justify why they use the augmentation procedure, and will this procedure achieve the usually desired doubly robust property?
[2] Athey, Susan, Raj Chetty, and Guido Imbens. Combining experimental and observational data to estimate treatment effects on long term outcomes. arXiv preprint arXiv:2006.09676, 2020.
[3] Cooper, Gregory F., and Changwon Yoo. Causal discovery from a mixture of experimental and observational data. arXiv preprint arXiv:1301.6686, 2013.
(b) There is an increasing body of work on combining offline and online data in RL, while it seems that the authors only discussed part of it; see some works below.
[4] Nair, Ashvin, et al. Accelerating online reinforcement learning with offline datasets. arXiv preprint arXiv:2006.09359, 2020.
[5] Gelly, Sylvain, and David Silver. Combining online and offline knowledge in UCT. Proceedings of the 24th International Conference on Machine Learning, 2007.
4. I don't agree with the authors' statement that "although we would have loved to compare against those approaches, the lack of available code did prevent us from running a fair comparison". Actually, by searching the titles of these cited papers together with "github", I did find their implementations, as follows. Thus the authors should add the comparison studies to justify their better performance.
[6] Nathan Kallus, Aahlad Manas Puli, Uri Shalit. Removing hidden confounding by experimental grounding. NIPS 2018. httpsgithubcomcausalmlremovinghiddenconfounding
[7] Elias Bareinboim, Andrew Forney, and Judea Pearl. Bandits with unobserved confounders: a causal approach. In NIPS 2015. httpsgithubcomnanavatirutucausalbandits

I think this is a borderline paper that addresses an important question with reasonably good performance while lacking the necessary elaboration and justification. As commented in my main review, my major concerns about recommending this paper lie in the framework, the novelty compared with the existing literature, and the comparison studies. I am willing to upgrade if my concerns can be addressed during the rebuttal period.

docsep
This paper studies the problem of evaluating interventional distributions, i.e., the system dynamics of a partially observed Markov decision process (POMDP), from samples collected from a combination of randomized experiments and observations of a privileged expert who could access the latent state. The POMDP is presumed to have a finite horizon (e.g., a physician can only perform a finite number of treatments on the same patient). The authors propose an unbiased estimator for evaluating the system dynamics from the experimental data. As for the observational distribution, where unobserved confounding exists, the authors derive bounds over the unknown system dynamics that are estimable from observations.

This paper studies the evaluation of interventional distributions in a canonical POMDP model with a finite horizon. The target query is P(o_{t+1} | do(a_{0:t}), o_{0:t}), where a_0, ..., a_t represent actions at stages 0, ..., t and o_0, ..., o_{t+1} represent partially observed states at stages 0, ..., t+1. This learning setting is general, since it could represent most treatment regimens in medical domains. The authors first show that, given data collected from interventions, the interventional query P(o_{t+1} | do(a_{0:t}), o_{0:t}) can be consistently estimated using the conditional distribution P_i(o_{t+1} | a_{0:t}, o_{0:t}), where i represents the intervention policy that generated the data. This is not surprising, since the sequential backdoor criterion is entailed in the interventional data. The authors then derive a bound over the product of target queries, prod_{t=0}^{T-1} P(o_{t+1} | do(a_{0:t}), o_{0:t}), from the observational distribution. The result appears interesting at first, but seems to be a simple application of Manski's bound (Manski, 1989). The authors validate their results through comprehensive
simulations. The results show that estimation using both the observational and the interventional data consistently outperforms the other learning strategies. However, it is unclear how the combination is done; that is, it would be interesting to see how the authors combine the unbiased estimator from the interventional data with the bound derived from the observational data. Unfortunately, this detail is not elaborated in the main manuscript.

Overall, the authors study an exciting topic, causal identification in POMDPs, which is a quite general and challenging learning setting. My main concern with this paper is its novelty. First, the unbiased estimator in Eq. 4 is not surprising and follows immediately from the backdoor criterion; I am pretty sure many similar MLE estimators have been proposed. Second, the bound in Theorem 1 might be interesting, but appears to be a simple application of the bound in Manski (1989). It would be encouraged if the authors could elaborate on how to combine these different methods to obtain a more accurate estimation of the target interventional distribution.

docsep
The paper considers the problem of learning a causal model in the POMDP setting. It assumes the learning agent has the ability to collect online experiences through direct interactions with the environment and can access a large collection of offline experiences obtained through the observation of another agent. It further assumes that the observed agent can act based on privileged information hidden from the learning agent. The paper formulates model-based reinforcement learning in this setting as a causal inference problem. The paper then proposes to use the offline data as a regularizer during learning. The paper presents empirical results on a number of toy problems.

Strengths:
- The paper proposes to learn a latent causal transition model explaining both the interventional and observational data, and to infer the standard POMDP transition model via deconfounding using the recovered latent variable.
- The paper shows that combining both intervention data and observation data can achieve better generalization guarantees in the asymptotic case.

Weaknesses:
- The setting considered is rather limiting because it assumes that there is no confounder in the model of the POMDP.
- The writing is very poor. The main contribution (Section 4.3) is not clearly explained; it is not clear how imposing an observational distribution qi 0 acts as a regularizer for the interventional distribution.
- Experiments are only done with very simple toy problems.

The setting considered is very limiting, as it assumes there are no latent confounders; see the following paper for the tricky issues involved: "Shaking the foundations: delusions in sequence models for interaction and control", Pedro A. Ortega, Markus Kunesch, Grégoire Delétang, Tim Genewein, Jordi Grau-Moya, Joel Veness, Jonas Buchli, Jonas Degrave, Bilal Piot, Julien Perolat, Tom Everitt, Corentin Tallec, Emilio Parisotto, Tom Erez, Yutian Chen, Scott Reed, Marcus Hutter, Nando de Freitas, Shane Legg. The experiments are not performed in any nontrivial settings. The writing makes the paper hard to read; for example, it references rule R2 without specifying where it is or first introducing it.

The authors have made their contributions and assumptions more clear and will add results comparing with related work; I am happy to upgrade my rating.
### Summary:
In this paper the authors provide a model-based approach for combining experimental and observational data in reinforcement learning, specifically in POMDPs. The paper was not received very favorably by reviewers, with the main concerns revolving around (a) writing quality, (b) validation, and (c) the extent of the contribution given existing work on causal RL. In preparing your revision, in addition to clarifying the writing and adding better validation, I would urge the authors to consult the existing causal inference literature on point and partial identification in settings related to RL, such as offline policy learning. This will help address the issues of novelty by extending their approach to settings with more types of confounding. In addition to the useful references suggested by reviewers, another useful draft may be "Path-dependent structural equation models", Srinivasan R., Lee J., Bhattacharya R., and Shpitser I., in Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence.
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
Often, in a deep generative model with multiple latent variables, the structure amongst the latent variables is pre-specified before parameter estimation. This work aims to learn the structure as part of the parameters. To do so, this work represents all possible dependencies amongst the latent random variables via a learned binary adjacency matrix c, where a 1 denotes each parent-child relationship. Each setting of c defines a latent variable as the root and subsequent parent-child relationships amongst the others. To be able to support up to n-1 parents, the paper proposes a neural architecture where the sample from each parent is multiplied by the corresponding value of c_ij (zeroed out if the edge does not exist in c), concatenated, and fed into an MLP that predicts a distribution over the child node. The inference network shares parameters with the generative model, as in Sonderby et al. Given any setting of c, one can define the variational lower bound of the data; this work performs parameter estimation by sampling c and then performing gradient ascent on the resulting lower bound. The model is evaluated on MNIST, Omniglot, and CIFAR, where it is found to outperform a VAE with a single latent variable with the same number of latent dimensions as the proposed GraphVAE, the LadderVAE, and the FCVAE (a VAE with a fully connected graph). An ablation study is conducted to study the effect of the number of nodes and their dimensionality.

Overall, the paper (a) is well written, (b) proposes a new, interesting idea, and (c) shows that the choice to parameterize structure via the use of auxiliary random variables improves the quality of results on some standard benchmarks.

Comments and questions for the authors:

Clarity: It might be instructive to describe in detail how the inference network is structured for different settings of c, for example via a scenario with three latent variables, rather than via reference to Sonderby et al. What prior distribution was used for c?

For the baseline comparison to the LadderVAE: what dimensionalities were used for the latent variables in the LadderVAE, which has a chain-structured dependence amongst its latent variables? The experimental setup keeps the latent dimensionality fixed to 80; the original paper recommends a different dimensionality for each latent variable in the chain (https://arxiv.org/pdf/1602.02282.pdf, Table 2). Was this tried? Did the LadderVAE do better if each latent variable in the chain was allowed to have a different dimensionality?

Related work: There is related work which leverages Bayesian nonparametric models to learn hierarchical priors for deep generative models; it is worth discussing in order to put this line of work into context, for example httpopenaccessthecvfcomcontenticcv2017papersgoyalnonparametricvariationalautoencodersiccv2017paperpdf and, more recently, https://arxiv.org/pdf/1810.06891.pdf. In the context of defining inference networks for generative models where the latent variables have structure, Webb et al. (https://arxiv.org/abs/1712.00287) describe how inference networks should be set up in order to invert the generative process.

Qualitative study: Notable in its absence is a qualitative analysis of what happens to the data sampled from the model when the various nodes in the learned hierarchy are perturbed, holding their parents fixed. Have you attempted this experiment? Are the edge relationships sensible or interesting? Is there a relationship between the complexity of each conditional distribution in the generative model and the learned latent structure? Specifically, have you experimented to see what happens to the learned
The authors propose to augment the latent space of a variational autoencoder [1] with an autoregressive structure to improve the expressiveness of both the inference network and the latent prior, making them into a general DAG of latent variables. This work goes further in the same direction as the Ladder VAE [2]. This paper introduces a mechanism for the latent model to directly learn its DAG structure, by first considering the fully-connected DAG of latent variables and adding Bernoulli variables controlling the presence or absence of each edge. The authors derive a new ELBO taking these variables into account and use it to train the model. The gradients of the parameters of the Bernoulli variables are computed using the Gumbel-softmax approach [3] while annealing the temperature. The authors observe in their experiments that the Bernoulli variables converge relatively quickly towards 0 or 1 during training, fixing the structure of the DAG for the rest of the training. They test their model against a VAE, a Ladder VAE, and an alternative to their model where the DAG is fixed to remain fully connected (FC-VAE), and observe improvements in terms of the ELBO values and log-likelihood estimations.

The main addition of this paper is the introduction of the gating mechanism to reduce the latent DAG from its fully-connected state; it is motivated by the tendency of latent models to fall into local optima. However, it is not clear to me what this mechanism, as it is now, adds to the model. The reported results show the improvements of GraphVAE over FC-VAE to be quite small, making their relevance dubious in the absence of measurement of variance across different trainings. Additionally, the reported performances for Ladder VAE are inferior to what [2] reports; actually, the performance of LadderVAE reported in [2] is better than the one reported for GraphVAE in this paper, both on the MNIST and Omniglot datasets.

The authors observe that the Bernoulli variables have converged after around 200 epochs. At this time, according to their reported experimental setup, the Gumbel-softmax temperature is 0.999^200 ≈ 0.82, which is still quite near 1.0, meaning the model is still pretty far from a real Bernoulli-like behavior. And actually, equation 9 is not a proper description of the Gumbel-softmax as described by [3]: there should be only 2 samples from the Gumbel distribution, not 3. Given these two issues, I can't believe that the c_ij coefficients behave like Bernoulli variables in this experiment; as such, it seems to me that GraphVAE is nothing more than a special reparametrization of FC-VAE that tends to favor saturating behavior for the c_ij variables.

On figure 3b, the learned structure is very symmetrical: z2, z3, and z4 play an identical role in the final DAG. In my opinion, this begs for the introduction of a regulatory mechanism regarding the gating variable to push the model towards sparsity. I was honestly surprised to see this gating mechanism introduced without anything guiding the convergence of the c_ij variables.

I like the idea of learning a latent structure DAG for VAEs, but this paper introduces a rather weak way to try to achieve this, and the experimental results are not convincing.

[1] https://arxiv.org/abs/1312.6114
[2] https://arxiv.org/abs/1602.02282
[3] https://arxiv.org/abs/1611.01144
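For reference, a sketch of what a two-class Gumbel-softmax sample for a single gate looks like under my reading of the cited technique [3]; this is meant to illustrate the point that only two Gumbel draws are involved, not to reproduce the paper's equation 9.

```python
import torch

def relaxed_bernoulli(logit, temperature):
    """Binary Gumbel-softmax sample for one gate c_ij.

    Two i.i.d. Gumbel(0, 1) draws, one per class (edge present / absent),
    are added to the unnormalized class log-probabilities before the
    tempered softmax.
    """
    log_probs = torch.stack([logit, torch.zeros_like(logit)])
    gumbels = -torch.log(-torch.log(torch.rand(2) + 1e-20) + 1e-20)
    y = torch.softmax((log_probs + gumbels) / temperature, dim=0)
    return y[0]   # soft value of the gate in (0, 1)

# example usage with the annealing schedule quoted above:
# relaxed_bernoulli(torch.tensor(0.3), temperature=0.999 ** 200)  # ~0.82
```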
This paper presents a VAE approach in which a dependency structure on the latent variables is learned during training. Specifically, a lower-triangular random binary matrix C is introduced, where c_ij = 1 (for i > j) indicates that z_i depends on z_j, with z the latent vector. Each element of C is separately parametrized by a Bernoulli distribution whose means are optimized during training, using the target $\mathbb{E}_{p(C)}[\mathcal{L}(C)]$, where $\mathcal{L}(C)$ indicates the ELBO for a particular instance of C. The resulting GraphVAE scheme is shown to train models with improved marginal likelihood over a number of baselines for MNIST, Omniglot, and CIFAR-10.

The core concept for this paper is good, the results are impressive, and the paper is for the most part easy to follow. Though I think a lot of people have been thinking about how to learn dependency structures in VAEs, I think this work is the first to clearly lay out a concrete approach for doing so. I thus think that even though this is not the most novel of papers, it is work which will be of significant interest to the ICLR community. However, the paper has a number of technical issues, and I do not believe the paper is suitable for publication unless they are addressed, or at the very least acknowledged. I further have some misgivings with the experiments and the explanations of some key elements of the method. Because of these issues I think the paper falls below the acceptance threshold in its current form, but I think they could potentially be correctable during the rebuttal period, and I will be very happy to substantially increase my score if they are; I feel this has the potential to be a very good paper that I would ultimately like to see published.

Lower bound. My first major concern is in the justification of the final approach (Eq. 8), namely using a lower-bound argument to move the p(C) term outside of the log. A target being a lower bound on something we care about is never in itself a justification for that target; it just says that the resulting estimator is provably negatively biased. The arguments behind the use of lower bounds in conventional ELBOs are based on much more subtle arguments, in terms of the bound becoming tight if we have good posterior approximations, and implicit assumptions that the bound will behave similarly to the true marginal. The bound derived in A.1 of the current paper is instead almost completely useless and serves little purpose other than adding mathiness of the type discussed in https://arxiv.org/abs/1807.03341. Eq. 8 is not a variational end-to-end target like you claim; it is never tight and will demonstrably behave very differently to the original target. To see why, consider how the original and the bound would combine two instances of C for the MNIST experiment: one corresponding to the MAP value of C in the final trained system, the other a value of C that has an ELBO which is, say, 10 nats lower. Using Eq. 8, these will have similar contributions to the overall expectation, and so a good network setup (i.e. theta and phi) is one which produces a decent ELBO for both. Under the original expectation, on the other hand, the MAP value of C corresponds to a setup that has many orders of magnitude higher probability, and so the best network setup is the one that does well for the MAP value of C, with the other instance being of little importance. We thus see that the original target and the lower bound behave very differently for a given p(C).

Thankfully, the target in Eq. 8 is a potentially reasonable thing to do in its own right, maybe actually more so than the original formulation, because the averaging over C is somewhat spurious given you are optimizing its mean parameters anyway. It is easy to show that the optimum p(C) for a given theta, phi is always a delta function on the value of C which has the highest ELBO(C). As Fig. 3 shows, the optimization of the parameters of p(C) practically leads to such a collapse. This is effectively desirable behavior given the overall aims, and so averaging over values of C is, from a modeling perspective, actually a complete red herring anyway. It is very much possible that the training procedure represented by Eq. 8 is, almost by chance, a good approach in terms of learning the optimal configuration for C, but if this is the case it needs to be presented as such, instead of using the current argument about putting a prior on C and constructing a second lower bound, which is at best dubious and misleading and at worst complete rubbish. Ideally the current explanations would be replaced by a more principled justification, but even just saying you tried Eq. 8 and it worked well empirically would be a lot better than what is there at the moment.
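In my own shorthand (the paper's exact Eq. 8 may differ), the two observations above are:

```latex
% Jensen's inequality moves p(C) outside the log:
\[
  \log \mathbb{E}_{p(C)}\!\left[ e^{\mathcal{L}(C)} \right]
  \;\ge\; \mathbb{E}_{p(C)}\!\left[ \mathcal{L}(C) \right],
\]
% but the two sides behave very differently: the left-hand side is dominated
% by the best configuration of $C$, while the right-hand side weights every
% configuration by $p(C)$.  Moreover, for fixed $\theta, \phi$ the right-hand
% side is linear in $p(C)$ and is therefore maximized by a point mass,
\[
  p^{*}(C) = \delta_{C^{*}}(C), \qquad
  C^{*} = \arg\max_{C} \mathcal{L}(C).
\]
```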
Encoder dependency structure does not match the generative model. My second major concern is that the dependency structure used for the encoder is incorrect from the point of view of the generative model; namely, a dependency structure on the prior does not induce the same dependency structure on the posterior. In general, just because z1 and z2 are independent doesn't mean that z1 and z2 are independent given x (see e.g. Bishop). Consequently, the encoder in your setup will be incapable of correctly representing the posterior implied by the generative model. This has a number of serious practical and theoretical knock-on effects, such as prohibiting the bound becoming tight, causing the encoder to indirectly impact the expressivity of the generative model, etc. Note that this problem is not shared with the Ladder VAE, as there the Markovian dependency structure produces a special case where the posterior and prior dependency structure is shared. As shown in https://arxiv.org/abs/1712.00287 (a critical missing reference), it is more generally actually possible to derive the dependency structure of the posterior from that of the prior; I think in your case their results imply that the encoder needs to be fully connected, as the decoder can induce arbitrary dependencies between the latent variables. I am somewhat surprised that this has not had more of an apparent negative impact on the empirical results, and I think at the very, very least the paper needs to acknowledge this issue. I would recommend the authors run experiments using a fully connected encoder and the GraphVAE decoder, and potentially also vice versa. Should this approach perform well, it would represent a more principled approach to replace the old one from a generative model perspective; should it not, it would provide an empirical justification for what is in essence a different restriction to that of the learned prior structure. It is conceivably actually the case that these encoder restrictions induce the desired decoder behavior, but this is distinct from learning a particular dependency structure in the generative model.
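A standard textbook illustration of the independence point (my example, not the paper's):

```latex
% Independent priors do not stay independent in the posterior.
% Take $z_1, z_2 \sim \mathcal{N}(0, 1)$ independently and observe
% $x = z_1 + z_2 + \epsilon$:
\[
  p(z_1, z_2) = p(z_1)\, p(z_2)
  \quad\text{but}\quad
  p(z_1, z_2 \mid x) \neq p(z_1 \mid x)\, p(z_2 \mid x),
\]
% since conditioning on $x$ couples $z_1$ and $z_2$ (explaining away), so an
% encoder whose dependency structure mirrors the prior cannot represent this
% posterior exactly.
```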
Specifics of model and experiments. Though the paper is generally very easy to read, there are some key areas where the explanations are overly terse. In particular, the explanation surrounding the encoding was difficult to follow, and it took me a while to establish exactly what was going on; I am still unsure how $\tilde{\psi}$ and $\hat{\psi}$ are combined. I think a more careful explanation here, and a section giving more detail in the appendices, would both help massively. I was not clear on exactly what was meant by the FC-VAE. I do not completely agree with the assertion that a standard VAE has independent latents: though the typical choice that the prior is $\mathcal{N}(0, I)$ obviously causes the prior to have independent latents, as explained earlier this does not mean the latents are independent in the posterior; furthermore, the encoder implicitly incorporates these dependencies through its mean vector, even if it uses a diagonal covariance (which is usually rather small anyway). What is actually changed from this by the FC-VAE? Are you doing some kind of normalizing-flow approach here? If so, this needs proper explanation. Relatedly, I am also far from convinced by the arguments presented about why the FC-VAE does worse at the end of the experiments. VAEs attempt to maximize a marginal likelihood through a surrogate target, and a model which makes no structural assumptions will generally have a lower marginal likelihood than one which makes the correct structural assumptions; it is thus perfectly reasonable that when you learn dependency structures you will get a higher marginal likelihood than if you presume none. I thus find your arguments about local optima somewhat speculative, and further investigation is required.

Experiments. Though certainly not terrible, I felt that the experimental evaluation of the work could have been better. The biggest issue I have is that no error bars are given for the results, so it is difficult to assess the robustness of the GraphVAE. I think it would be good to add convergence plots with error bars, to see how the performance varies with time and to provide an idea of variability. More generally, the experiment section overall feels more terse and rushed than the rest of the paper, with some details difficult to find or potentially even straight-up missing. Though Fig. 3 is very nice, it would be nice to have additional plots showing qualitatively what happens with the latent space: e.g., on average what proportion of the C tend to zero? Is the same dependency structure always learned? What do the dataset encodings look like? Are there noticeable qualitative changes in samples generated from the learned models? I would be perfectly happy for the paper to extend over the 8 pages to allow more results addressing these questions.
### Summary:
Strengths: this paper develops a method for learning the structure of discrete latent variables in a VAE; the overall approach is well explained and reasonable. Weaknesses: ultimately this is done using the usual style of discrete relaxations, which come with tradeoffs and inconsistencies. Consensus: the reviewers all agreed that the paper is above the bar.
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
As the title suggests, the paper is a comparison of recent continual learning methods that prevent catastrophic forgetting, and of their effectiveness in some text classification tasks using popular pretrained language models such as BERT, RoBERTa, etc. The paper divides continual learning methods into three categories: (1) rehearsal-based, (2) regularization-based, and (3) dynamic architecture. The experimental results show that rehearsal-based methods are superior to the other two, and also that BERT is generally better than other candidates. The paper then proposes a new probing technique to find out what makes rehearsal-based methods better and what is happening inside BERT; the paper finds that the last layer has the biggest catastrophic forgetting and lower layers are less impacted.

I think a comparative study paper should satisfy at least two conditions to be considered for publication at a venue like ICLR: first, it should present a novel view on the problem, and second, it should draw a novel conclusion out of the experiments. Although the paper could be a good survey for readers who want to learn about continual learning, I think its viewpoint is not new and its conclusion is not surprising. While it is helpful to know that rehearsal works better than regularization on most datasets, this is not entirely a surprising result; I think it is a common belief that rehearsal-based methods are more robust against catastrophic forgetting, while regularization methods are more space-efficient in that they don't have to store examples. The fact that the last layer suffers from catastrophic forgetting is also not a surprising result, given that the lower layers are known to encode linguistic features and the upper layers encode task-specific features. While the paper can be helpful for readers who want to learn about continual learning in text classification using pretrained language models, the paper does not seem to bring a sufficiently novel viewpoint or conclusion to be published at ICLR.

This paper conducts an empirical study on the catastrophic forgetting of pretrained language models in two continual learning settings: class-incremental and task-incremental. The paper evaluates multiple pretrained models on different datasets to see how severe the catastrophic forgetting issue is for these pretrained models; then the paper also tests the effectiveness of multiple continual learning methods on such pretrained models and draws some conclusions. Although the authors have conducted quite a lot of experiments, the phenomena shown in the experiment results are hardly surprising to me. It is not surprising that pretrained language models have forgetting issues when fine-tuned on downstream tasks; it is also not surprising that rehearsal-based methods perform the best for pretrained models. Moreover, the paper draws the conclusion that BERT is the most robust one and is a good option if a continual learning process is going to be conducted; based on this, the authors provide a few analyses on BERT's secret for continual learning. However, compared with other pretrained models, I don't see that BERT is significantly better than others given the figures and tables; from the figures and tables, BERT and other models look similar, and the authors didn't give a comprehensive explanation of how they read such information, or a concrete quantitative comparison to support this claim. A thorough empirical analysis with unsurprising conclusions.
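For context, a minimal sketch of the kind of layer-wise probing these reviews refer to, assuming a HuggingFace-style encoder that exposes hidden states; the helper names, the [CLS] pooling, and the training-loop details are my own assumptions rather than the paper's protocol.

```python
import torch
import torch.nn as nn

@torch.no_grad()
def layer_representations(model, input_ids, attention_mask):
    """Return the [CLS] vector from every hidden layer of a frozen encoder."""
    out = model(input_ids, attention_mask=attention_mask, output_hidden_states=True)
    return [h[:, 0] for h in out.hidden_states]   # one (batch, dim) tensor per layer

def probe_layer(features, labels, num_classes, epochs=5, lr=1e-3):
    """Fit a linear probe on one layer's frozen features for one past task."""
    probe = nn.Linear(features.shape[-1], num_classes)
    opt = torch.optim.Adam(probe.parameters(), lr=lr)
    for _ in range(epochs):
        loss = nn.functional.cross_entropy(probe(features), labels)
        opt.zero_grad(); loss.backward(); opt.step()
    return probe  # per-layer probe accuracy indicates how much task information survived
```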
The authors perform a comprehensive study of how pretrained language models work in the continual learning setting. The authors study 5 relevant pretrained language models (masked and unmasked) and somewhere between 3 and 6 continual learning strategies, depending on where in the paper they are counted. In addition to a thorough everything-by-everything evaluation, the authors hone in on the details of how the different models and CL approaches are reflected in the transformer layers. The authors find that the different language models studied perform relatively differently, both qualitatively and quantitatively, and these insights may prove useful for directing future improvements.

Strengths: overall, the study is very thorough, covering both the correct range of options for each axis studied and a set of relevant cross-axis, multiple-variable experiments. The organization is good but not perfect (see below); its strengths are that the different options considered along each axis are clearly laid out ahead of time, with the exception of the continual learning strategies. The layer-wise analysis in particular is interesting and tells a coherent story, despite the challenges of displaying complicated 3D data. Overall, it seems that recently many studies compare quantitatively against multiple PLMs, which ultimately appear similar due to only slightly different performance numbers; this study's most successful contribution, in my opinion, is an exploration of the qualitative differences among PLMs in the continual learning setting.

Weaknesses: it would be really great to see how the insights after analysis can be used to improve performance. It's probably not absolutely required given the focus on probing, but it would go a long way towards validating the insights. Adjusting the numbers to achieve the 5-4-3-2 cuteness gets slightly in the way of understanding; unfortunately, the main reason for this is that the number of "veins" of CL methods differs over the course of the paper, which makes it hard to identify when a given list of n things is a list of the veins of CL methods. Specifically: in the abstract and intro this number is 4, in section 2.3 this number is 3, in section 3.1 this number is 6, in table 1 this number is 5 (two different sets of 5), in figure 1 this number is 6, and in figure 2 this number is 4. For a paper that has so much going on and so many different lists of different sizes, keeping these consistent would make it much easier for the reader to understand, at any point, what exactly a given list of n items is referring to. Along the same lines, it would help to be consistent with the language around each set of n things; for example, the veins of CL methods are called at least "veins", "schemes", and "approaches" at different points. Section 3.2, table 1, and figure 1 are relatively weak in my opinion: what am I supposed to conclude? I can look at the table and see the different results, but so what? What should I be drawing my eye to in the table (bold would help)? The figure here is too small to even attempt to parse. It would be very helpful to have a sentence that gives intuition about what is being measured with the accuracy metric; the definition is there, but it took me a second to realize that the intuitive idea is that it is measuring the accuracy of past tasks after the model has moved on to learning new tasks.

Overall, the authors perform a quite deep study of using pretrained language models for continual learning. Despite some weak points in the analysis of the quantitative results and inconsistent organization/language around the CL approaches, the thoroughness of the study, in particular the analysis at a layer-by-layer level, is likely of interest to the broader community.
This paper explores continual learning performance when combining different PLMs and common continual learning methods with 3 challenging NLP classification tasks. To benchmark these combinations, the methods are evaluated in task-incremental and class-incremental learning settings over various NLP end-tasks, which covers common learning settings in continual learning and NLP. There is also a layer-wise performance analysis to identify which layers keep or forget task-relevant information during training. Overall, the paper shows that forms of replay outperform other methods like regularization.

Strengths: the analysis of PLMs is carried out over different continual learning techniques, NLP tasks, PLM variants, and finally by layer-wise probing analysis; understanding the strengths and weaknesses of each PLM is very desirable research progress. The authors also provide new research questions that arise from the analysis and point to interesting unsolved research challenges. Evaluation starts off with the expected lower and upper bounds of performance and then moves on to disentangle FWT and BWT (forward and backward transfer) performance when using various CL techniques. The chosen datasets are not well behaved (i.e., experience imbalances), which makes the results more realistic and less artificial. Overall, this study provides a necessary step towards exploring future continual learning methodology and explores many important factors on eight pages. From the batch-learning adaptation literature on PLMs one may expect baselines such as various adapter-block versions or anti-forgetting hacks, but it is understandable that the authors did not test these, since adapters would likely introduce complexity per increment and quickly become impractical; as the authors mention, CL-specific future extensions to adapters are conceivable but a work of their own.

Minor weaknesses (easily fixable) and suggestions for improvement:
- 1 content page is left for improvements; some plots seem to be very small and may be enlarged to use the 9th page.
- Fig. 2: plots should share a larger model legend at the figure top or bottom, so the bars can become wider and easier to distinguish.
- Fig. 2: colors could be made more distinguishable, especially since the plots are narrow.
- Tab. 1: could underline the best non-joint performance; this makes it easier to read at a glance.
- Sec. 4: that transformer layers except the classification layer do not adapt much during fine-tuning is a known result (hence assumptions 1 and 2 in the paper), which should be cited; see the BERTology primer by Anna Rogers for references. Here intermediate layers are shown to forget as well, so this is a nice new finding that can be contrasted.
- Fig. 3: I assume the figure color scale is probe performance. Also, the buffer size (e.g. ER-200) should be explained with an example: is it 200 ER samples, or 200 samples per class? In section 5.1 it should be re-mentioned what the buffer size means.
- Spelling errors (not all listed; using a text-to-speech app/function makes it easy to find these): sec 2.3 / sec 3.2 (e.g.): "could be find" → "can be found"; "tendency to forgetting" → "to forget" / "towards forgetting"; "reach the highest" → "reaches"; sec 4: "interferes with the plms ability to retrain representations"; sec 4.1: "joint" → "joint multitask training", "from sketch" → "from scratch"; sec 4.2: "methods making sense" → "make sense"; the sentence "2 the classification layer clf is typically the most fragile of bert where continual learning learning methods making sense" is not understandable.

Questions to the authors: none.
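Since several of these reviews hinge on experience replay (ER) and its buffer size, here is a minimal sketch of a replay loop with a fixed-size, reservoir-sampled buffer; the 1:1 mixing of current and replayed batches and all names are my own choices, not the paper's setup.

```python
import random

class ReplayBuffer:
    """Fixed-size episodic memory filled by reservoir sampling."""
    def __init__(self, capacity=200):
        self.capacity, self.seen, self.data = capacity, 0, []

    def add(self, example):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = example   # replace a stored example uniformly at random

    def sample(self, k):
        return random.sample(self.data, min(k, len(self.data)))

def training_step(model, loss_fn, optimizer, batch, buffer):
    """One update on the current task's batch plus a replayed batch from memory."""
    replay = buffer.sample(len(batch))
    loss = loss_fn(model, batch) + (loss_fn(model, replay) if replay else 0.0)
    optimizer.zero_grad(); loss.backward(); optimizer.step()
    for ex in batch:
        buffer.add(ex)
```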
Regarding clarity: the problem, the proposed CL algorithms, the benchmarks, and the evaluation metric are suitable to answer the research questions. The performance and layer-wise analysis is conducive to deepening an understanding of the shortcomings and potential opportunities of PLMs for continual learning. Unsurprisingly, variants of ER are the most effective technique in incremental CL with PLMs, which is somewhat disappointing but also a standard outcome in CL. The paper is mostly well written, and the experiments and sub-research-questions are logically structured; some minor details can be improved and have been pointed out in the review. The insights are valuable and provide a solid foundation for follow-up studies. The details in the appendix were interesting and added to the paper and its reading flow (e.g., by moving cumbersome details like hyperparameters to a dedicated appendix section). I thus feel confident in recommending this paper for acceptance.
### Summary:
This paper presents a comparison and analysis of continual learning methods for pretrained language models. The authors categorise continual learning methods into three categories: those that use cross-task regularisation, those that employ some form of experience replay of previous training examples, and those that dynamically alter the network architecture for each task. Evaluation results from representative examples of these three paradigms are then presented and analysed. In general, methods that incorporate experience replay appear to perform the best, while analysis of the predictive power of individual layers of the pretrained models suggests that some network layers are more robust to catastrophic forgetting than others, and that this also varies across architectures (BERT, ALBERT, etc.).

In general, the reviewers agree that this is a well-conducted study that provides an interesting contribution to an important area of research. They also generally agree that many of the results are unsurprising given the properties of the algorithms explored and prior work in this area; the main point of difference then becomes how valuable it is to present a thorough study of existing algorithms that confirms our assumptions. I believe that the current work raises enough interesting questions to make it a useful contribution to researchers working in continual learning; in particular, the results analysing the relative differences in catastrophic forgetting across different layers in models suggest interesting avenues for follow-on work.
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary this paper proposes a selfdistillation based graph augmentation mechanism to alleviate the drawbacks of existing mi based models wrt their high dependency towards negative sampling quantitatively the proposed model achieves encouraging results however it would have been better if the system designs and significant differences of igsd from existing work were discussed strength this work has clearly discussed a drawback of existing unsupervised mi based models which are the leading approach in graph classification they propose a mechanism to address this issue with satisfactory quantitative results in the unsupervised setting and an extended semisupervised setting with selftraining also supported quantitatively the paper is clear in general with a clear research problem a proposed mechanism for the unsupervised and semisupervised graph representation domain and encouraging quantitative results weakness there is a lack of qualitative analysis and discussion of the proposed method in section 43 performance with different amounts of negative pairs the reasoning behind the provided observation from figure 3a is not clear the motivation behind selecting a teacherstudent network for obtaining different views of the graph is not clear these networks are normally used for knowledge transfer but are here used for contrastive learning how is this more beneficial than an ensemble model without the knowledge transfer step of eq 3 the core difference of igsd from cmcgraph is that cmc uses an mi based contrastive loss between the local patch representation and the graph representation while igsd uses an l2 based contrastive loss between 2 graph representations the input encoders and projections are the same for both architectures it could be useful to add some analysis to discuss these differences and their contributions to clearly understand the significance of igsd the following paper seems to have stateoftheart results although it is based on graph kernels so why are its results not included convolutional kernel networks for graphstructured data icml2020 after rebuttal i thank the authors for the response i still have concerns re their comparison with gckn icml2020 the results reproduced by the authors differ significantly from the published ones in table 1 of the gckn paper 5 eg mutag is 928 in the original paper but the authors report 872 for gckn the difference is significant therefore i will keep my original rating docsepoverall comments learning graphlevel representations with only labels has been explored by many works however its not easy to annotate every graph this paper applies the ideas from the semisupervised classification task to improve the representation quality learned by a graph neural network specifically the proposed solution combines several kinds of existing techniques including diffusion graph augmentation mean teacher consistency debiased contrastive loss and pseudo class consistency finally they are combined together to act as a regularization term by utilizing the unlabelled data from this point of view the novelty of this work is incremental but its still an interesting work for improving graphlevel representations clarity the presentation is not clear enough there exist many claims that are not clear shown as follows 1 in the last sentence of the 3rd paragraph in the introduction section its difficult to get the connection between negative sample mining and the selfdistillation strategy why can using selfdistillation alleviate the dependency on negative sample mining the unsupervised objective in equation 4 still depends on negative samples 2 in section 21 the
notation for augmentations why are the graphs gl attached without labels after being augmented 3 in section 23 the authors first use ppr to augment node features then randomly remove edges to create a corrupted graph according to the description the question is how many views will be used in the following sections i guess that the graph feature from the original graph will be fed to the student network and the augmented corrupted graph will be fed to the teacher network questions for rebuttal 1 please clarify the questions mentioned above 2 the proposed method contains an encoder a projector and a predictor the question is why we need a projector g to get a higher dimension z does it have a big influence on the performance could you please give the complete definitions of the functions g and h 3 the definition of lcon in equation 2 is for a positive sample extracted from the same graph gi however the complete unsupervised loss needs negative samples gj could you please also give the definition for lgi gj 4 the overall loss consists of supervised and unsupervised losses the lsup conflicts with the first term in equation 7 both of them use labels but its difficult to tell which one should be aligned with the supervised loss shown in the ablation study table 2 the supcon has never been shown in the main content before please pay attention to make it clear docsep update after reading the authors response the authors didnt address my question did the authors perform a significance study a significance test such as a doublesided ttest is needed to verify whether the proposed method is significantly better than baselines this paper proposed a distillation approach for unsupervised graph representation learning the approach partially builds upon contrastive selfsupervised learning which contrasts pairs of augmented graphs the approach is extended to the semisupervised setting the authors performed evaluation in graph classification and regression tasks i recommend rejecting this paper due to the following major concerns 1 experimental results are not strong 2 important baselines were not compared 3 important details such as optimal hyperparameter values are missing my major concerns about this paper include 1 the improvement of the proposed approach over baselines does not seem significant for example in table 1 comparing the mean and standard deviation of the proposed approach and cmcgraph it seems that the difference is not statistically significant did the authors perform a significance study 2 in the experiments why didnt the authors compare with gcc which is a contrastive selfsupervised learning approach applied to graph classification 3 there are many other unsupervised graph representation learning methods the authors need to compare with more to substantiate this work 4 in hyperparameter tuning the authors gave the range of hyperparameters tuned but didnt give the optimal values of the hyperparameters which makes the paper difficult to reproduce 5 in table 1 the authors excluded some results since they need more than 1 day to obtain it is common for deep learning models to run several days to obtain results i dont think it is proper to exclude these results simply because the runtime is more than 24 hours however the paper does have a few strong points 1 the ablation studies are well designed and the results are insightful 2 the paper is wellwritten and easy to follow with a clear organization 3 the experiments were conducted on a rich collection of datasets other comments 1 in equation 3 the authors can draw a connection
with moco 2 in the table why didnt the authors report the mean and standard deviation of the results 3 for this result when batch size is greater than 32 igsd outperforms cmcgraph and the performance gap becomes larger as the batch size increases can the authors provide a reason that can possibly explain this phenomenon 4 the authors can add some statistics of the datasets used in figure 2 docsepthis paper proposed a method for learning graphlevel representations in an unsupervised contrastive way instead of contrasting between the graphlevel representation and the patch representation like infograph 1 they contrast the graphlevel representation of a graph with its augmented variant using a teacherstudent framework why use the infonce objective instead of the jensenshannon mutual information objective used in infograph 1 the major concern about this paper is that the proposed method encourages the closeness of augmented views from the same graph instances but provides no guarantee that the transformations used in this paper graph diffusion and sparsification with ppr and random edge removal would be labelpreserving for example in molecular datasets if we drop an edge and that edge happens to be in a structural motif it will drastically change the attributes and labels of the molecule v represents nodes in sections 21 and 22 and it represents graph instances in section 31 and figure 1 this can be confusing i suggest changing the notation in section 31 and figure 1 to g 1 fanyun sun jordan hoffmann vikas verma and jian tang infograph unsupervised and semisupervised graphlevel representation learning via mutual information maximization arxiv preprint arxiv:1908.01000 2019
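To make concrete the distinction the reviewers draw between an InfoNCE-style contrastive objective (which uses the other graphs in the batch as negatives) and an L2 consistency between two views of the same graph (the self-distillation style objective that needs no negatives), here is a minimal sketch. The embedding sizes, temperature, and the assumption that `z_student` / `z_teacher` are graph-level embeddings of two augmented views are illustrative only, and the Jensen-Shannon MI estimator used by InfoGraph is a further alternative not shown here.

```python
import torch
import torch.nn.functional as F


def info_nce_loss(z_student, z_teacher, temperature=0.5):
    """InfoNCE over a batch: each graph's teacher-view embedding is the positive,
    every other graph in the batch acts as a negative."""
    z_s = F.normalize(z_student, dim=1)
    z_t = F.normalize(z_teacher, dim=1)
    logits = z_s @ z_t.t() / temperature           # (B, B) similarity matrix
    targets = torch.arange(z_s.size(0))            # positives lie on the diagonal
    return F.cross_entropy(logits, targets)


def consistency_loss(z_student, z_teacher):
    """L2 consistency between the two views of the *same* graph only;
    no negative pairs are involved (the self-distillation style objective)."""
    z_s = F.normalize(z_student, dim=1)
    z_t = F.normalize(z_teacher, dim=1)
    return ((z_s - z_t) ** 2).sum(dim=1).mean()


if __name__ == "__main__":
    # toy usage: pretend these are graph-level embeddings of two augmented views
    torch.manual_seed(0)
    student = torch.randn(16, 64)   # e.g. original graphs through the student encoder
    teacher = torch.randn(16, 64)   # e.g. diffused / edge-dropped graphs through the teacher
    print("infonce:", info_nce_loss(student, teacher).item())
    print("consistency:", consistency_loss(student, teacher).item())
```

The practical difference the reviews point at is visible here: the InfoNCE term depends on how the batch of negatives is built (hence the batch-size sensitivity discussed above), whereas the consistency term only compares matched views of the same graph.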
### Summary:
this paper proposes an unsupervised graph learning method iterative graph selfdistillation igsd by iteratively performing selfdistillation to contrast graph pairs under different augmented views this idea is then extended to the semisupervised setting via a supervised contrastive loss and selftraining the method is empirically evaluated on some semisupervised graph classification and molecular property prediction tasks and has achieved promising results reviewers agree that the method is interesting and the paper is wellwritten the biggest concern from reviewers related to the experimental evaluation of the method the authors responded to this and included additional experiments although the reviewers appreciate the provided results and explanations in the end they were not convinced about the empirical assessments in particular r1s post rebuttal comment indicates concerns about the reported performance of gckn which is different from the published one in table 1 of the gckn paper i encourage the authors to improve on these experimental discrepancies and resubmit
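One concrete request that recurs in the reviews above is a two-sided (double-sided) significance test against the strongest baseline. A minimal version over per-seed accuracies could look like the following; the accuracy lists are made-up placeholders, not numbers taken from the paper.

```python
from scipy import stats

# hypothetical per-seed accuracies for the proposed method and a baseline
# (placeholder values for illustration only)
igsd_acc      = [0.912, 0.905, 0.918, 0.909, 0.915]
cmc_graph_acc = [0.903, 0.899, 0.910, 0.901, 0.907]

# two-sided independent t-test (Welch's variant, not assuming equal variances)
t_stat, p_value = stats.ttest_ind(igsd_acc, cmc_graph_acc, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("difference is statistically significant at the 5% level")
else:
    print("difference is not statistically significant at the 5% level")
```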
1543,
891,
13414,
1158,
352,
310,
1463,
281,
16670,
841,
1543,
3365,
984,
253,
20243,
310,
625,
685,
2164,
3038,
50276,
35529,
253,
2929,
1057,
452,
247,
1643,
2266,
2792,
337,
253,
28913,
2175,
403,
973,
4158,
285,
253,
1543,
403,
47860,
50276,
19,
253,
2929,
310,
973,
15720,
285,
3477,
281,
956,
342,
247,
2590,
6003,
50276,
20,
253,
4679,
497,
5196,
327,
247,
6793,
4849,
273,
15302,
50275,
977,
5701,
337,
275,
5150,
495,
253,
4477,
476,
3812,
247,
4602,
342,
278,
16856,
374,
275,
2829,
2139,
42126,
1304,
1599,
285,
2629,
11254,
273,
253,
1543,
495,
323,
436,
906,
672,
14604,
1979,
310,
3687,
685,
4567,
25477,
8289,
41731,
13015,
7892,
68,
10580,
285,
253,
3045,
8037,
4916,
4067,
347,
253,
14604,
1979,
5459,
50276,
5092,
253,
4477,
2085,
247,
1921,
326,
476,
6830,
5513,
436,
11562,
577,
253,
4477,
476,
823,
690,
9990,
273,
253,
15302,
908,
275,
4677,
374,
50274,
7152,
33032,
2520,
2929,
4081,
247,
1332,
323,
4715,
4216,
5251,
6779,
275,
271,
440,
35421,
4499,
422,
1039,
3185,
273,
42455,
875,
4216,
5251,
6779,
285,
12097,
6779,
751,
2192,
2047,
337,
597,
4499,
4216,
5251,
6779,
273,
247,
4216,
281,
697,
31612,
7629,
970,
247,
9732,
39095,
7792,
50275,
22309,
897,
2192,
19131,
8103,
3185,
273,
253,
480,
561,
561,
73,
16554,
15577,
1491,
8103,
908,
275,
2192,
2047,
337,
50273,
783,
2201,
4468,
670,
436,
1349,
254,
310,
326,
253,
4081,
1332,
29426,
253,
2734,
8098,
273,
31612,
6849,
432,
253,
1072,
4216,
10872,
533,
2085,
642,
12215,
326,
253,
9261,
908,
4216,
12393,
285,
37139,
1877,
342,
268,
1087,
50276,
14719,
5386,
9297,
275,
436,
2929,
651,
320,
5203,
10192,
26368,
323,
1650,
275,
5787,
15302,
604,
359,
5926,
271,
5024,
285,
326,
5024,
6569,
281,
320,
275,
247,
8350,
14443,
352,
588,
31063,
1818,
253,
12474,
31294,
273,
253,
12570,
50274,
87,
6125,
7632,
275,
2593,
3127,
285,
3307,
285,
352,
6125,
4216,
10872,
275,
2593,
4562,
285,
4677,
337,
436,
476,
320,
21643,
891,
1804,
6890,
253,
14951,
275,
2593,
4562,
285,
4677,
1903,
281,
305,
50276,
18,
269,
1279,
328,
5101,
480,
11208,
288,
2727,
8420,
362,
1479,
284,
2336,
785,
285,
480,
757,
12717,
2192,
2047,
440,
35421,
285,
49863,
29974,
13337,
4216,
5251,
6779,
4715,
3066,
15577,
1491,
11903,
1320,
549,
32693,
638,
3845,
549,
32693,
746,
2904,
520,
933,
6247,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
271,
440,
35421,
4216,
4715,
1332,
34560,
4216,
1881,
8155,
21755,
25477,
8289,
407,
10040,
3146,
9591,
1881,
8155,
21755,
281,
4499,
4216,
8557,
762,
1027,
31612,
6849,
436,
2934,
310,
840,
6508,
281,
49863,
29974,
13337,
4758,
835,
3066,
247,
22296,
4499,
422,
2957,
285,
11329,
649,
26208,
253,
1332,
310,
45190,
6760,
327,
690,
49863,
29974,
13337,
4216,
9162,
285,
5787,
2867,
10554,
8892,
285,
556,
6786,
12532,
1543,
50276,
15337,
398,
5194,
326,
253,
1332,
310,
4722,
285,
253,
2929,
310,
973,
15720,
253,
5962,
4468,
432,
30628,
2905,
281,
5661,
27163,
273,
253,
1332,
253,
4477,
10974,
281,
436,
285,
2908,
3081,
4679,
3738,
253,
30628,
11435,
253,
2530,
1543,
285,
22909,
387,
253,
990,
597,
497,
417,
13762,
670,
253,
16774,
20215,
275,
1798,
391,
18,
84,
1501,
30080,
22559,
4385,
6492,
7350,
670,
253,
2361,
3045,
273,
305,
777,
79,
534,
310,
1027,
432,
253,
3863,
581,
275,
2829,
337,
273,
305,
777,
79,
2929,
891,
11907,
4477,
281,
3157,
327,
841,
5661,
37122,
285,
501,
538,
2225,
209
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper presents an algorithm for compressing the size of entire training data into a few synthetic training samples the method is based on neural networks and is applied on image datsets the authors comment two possible applications of their method domain adaptation and effective data poisoning attack the proposed technique seems to be limited to neural networks since it seems that is linked to the initialization of the networks in this aspect it could be interesting to have a more general method there are related works that are not commented for instance olveralpez j arturo et al a review of instance selection methods artificial intelligence review 342 2010 133143 experimental section is weak few datasets are considered other problems should be added additionally related methods should be included to compare the performance of the proposal some comments about the computational cost should be inserted in this aspect the experimental section should be improved following these recommendationsdocsep post rebuttal update thanks to the addition of better baselines ive increased my score for this paper while im still not super convinced of its potential for application i find the idea original and worth discussing at the conference prerebuttal review this paper presents an approach to compress a dataset into a much smaller number of synthetic samples that are optimized to yield as good performance as possible when a given model is trained on that smaller dataset this is done by unrolling the gradient descent procedure of training such a model to allow for gradientbased optimization of synthetic samples themselves as well as the used learning rates in summary my evaluation is as follow pros pretty original problem formulation generally well written paper cons lack of comparison with simple baselines in basic dataset distillation setting use in practical applications domain adaptation data poisoning yet to be convincingly demonstrated possibly a mistake in the theoretical analysis of the linear case indeed i found the paper to be generally quite clear and enjoyed reading it one minor thing i struggled a bit with is the distinction between sg steps and epochs i believe the former corresponds to when the synthetic samples are different between gd steps whereas the later corresponds to the number of times the method repeatedly cycles over these samples so i would perhaps encourage the authors to emphasize that difference i also find the problem statement that proposed to be interesting and thought provoking and the solution thats proposed seems quite appropriate and well thought out that said im worried about the following unless i misunderstood in the basic dataset distillation setting a comparison is never provided with training on a randomly selected subset of the training set presumably the results are worse but i think these results should be in the paper i would also argue for having another baseline which would try to approximately optimize the choice of which training examples are put in the subset a very simple approach would be to take the 200 runs already performed for the random selection and select the subset providing the best accuracy on the full training set and only report the performance of that subset instead of the mean and std of all 200 runs in short this would help determine to what extent there is value to synthesizing entirely new samples moreover i think a simple alternative baseline for creating synthesized samples should be considered specifically id personally would 
like to know the performance of using perclass kmeans clustering and training on the cluster centroids as the distilled dataset while i appreciate that the authors identify potential applications and report some results on them i think they currently fall short of convincing the reader of the potential of dataset distillation for these applications for domain adaptation no actual domain adaptation baseline is compared against a good candidate would be method from daume iii 2007 at the very least for data poisoning i find that the assumptions for attacks are pretty strong ie that a you have access to the parameters of the pretrained models to attack and b that the model is doing additional updates only on the synthesized data if there are reasons to think that such assumptions are reasonable id at least expect the paper to motivate why that is in the analysis of the simple linear case in equation 7 there appears to be a mistake specifically some missing parentheses dtd ietam dtdtheta0 etam tildedttildet dt t ie there should be parentheses right after dtd and right before this is from replacing theta by the expression for theta1 in equation 6 which is what i think equation 7 is supposed to be doing this possibly doesnt affect some of the conclusions taken from this section but id like to see this potential mistake discussedaddressed that said if the authors can sufficiently address the 3 points above id be willing to increase my rating for this paper finally i have a few other more minor nicetohave points having in the related work a discussion on the relationship with coreset methods would be nice experiments showing how well the distilled datasets transfer to different network architectures than those used in training would be interesting even other ml algorithms would be quite interesting we often find that the number of distilled images required to achieve good performance is an informative indicator of the dataset diversity im not sure what in the paper actually justifies demonstrates this statement figure 4 is presented as an ablation study but an ablation study is where you remove certains parts of a model or algorithm and see what happens which isnt the case here i think its better described as a hyperparameter sensitivity study some typos the below objective the objective below wrt to wrt the discrete part rather the discrete parts rather necesary necessary docsepthe paper addresses the interesting problem of generating a small number of synthetic examples that can be used to train a classifier replacing a larger dataset the paper is clearly written the approach makes sense and the experiments are interesting my major concerns are regarding to previous literature analysis of the algorithm and details of the experiment overall i expect an iclr paper to go deeper rather than wide i recommend presenting strong convincing evidence on one front specific comments 1 im missing analysis of the proposed procedure it wasnt fully clear which loss it minimizes and if it indeed guaranteed to converge to the minimum of that loss 2 the topic of learning from few samples is presented as completely new it is well known that for classical linear algorithms like the perceptron and svm the weights are a weighted sum of labelweighted samples hence by definition of these algorithm there is a single sample that can be used to train the model in one step id expect some discussion of how the proposed approach relate to these classical approaches there is also existing literature on a related problem of 
selecting samples teaching dimension goldmankearns that could be somewhat relevant here 3 motivation the paper provide several motivations for dataset distillation i support the first motivation of scientific understanding what data is actually needed for a classifier and this means that deeper analysis is needed the practical motivations are less convincing because a domain adaptation experiments are not compared with real baselines b robustness of poisoning with a single sample is not studieddiscussed 4 experiments the intro states that training with 10 images reaches 94 accuracy but this does not seem consistent with the results in table 1 the caption of figure 2 suggests that accuracy is between 12 and 94 which means the stated 94 is not representative or typical could you clarify for domain adaptation the baseline random images are very weak and still perform almost comparably to the proposed approach more robust experiments are needed here stronger baselines decent hyperparameter search etc 5 writing and exposition the paper addresses two issues a learning with few synthetic samples and b learning with few gradient steps the intro tends to mix the two and it is not clear why learning with a single gradient step is important i recommend to separate the two topics more clearly
### Summary: | the reviewers agree that the idea for dataset distillation is novel however it is unclear how practical it can be the paper has been significantly improved through the addition of new baselines however ultimately the performance is not quite good enough for the reviewers to advocate strongly on its behalf perhaps the paper would be better motivated by finding a realistic scenario in which it would make sense for someone to use this approach over reasonable alternatives | [
30003, 310, 1677, … (input_ids truncated) ] | [ 1, 1, 1, … (attention_mask truncated) ] | [ 30003, 310, 1677, … (labels truncated) ]
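
The row above describes dataset distillation as unrolling the gradient-descent training of a model so that the synthetic samples, and the inner learning rate, can be optimised directly; the reviewer's linear-case discussion corresponds to a single update of the form theta1 = theta0 - eta * grad L(theta0; x_syn, y_syn). A minimal sketch of that unrolled loop for a linear classifier is given below; it is not the paper's code, and the shapes, names and hyperparameters are illustrative assumptions.

import torch
import torch.nn.functional as F

def distill_linear(x_real, y_real, d=784, n_classes=10, n_syn=100,
                   n_outer=1000, lr_outer=0.01):
    # synthetic inputs, fixed class-balanced labels, and a learnable inner step size
    x_syn = torch.randn(n_syn, d, requires_grad=True)
    y_syn = torch.arange(n_syn) % n_classes
    log_eta = torch.zeros((), requires_grad=True)      # eta = exp(log_eta) stays positive
    outer_opt = torch.optim.Adam([x_syn, log_eta], lr=lr_outer)

    for _ in range(n_outer):
        # fresh random initialisation each outer step (the distilled set is tied
        # to this initialisation distribution, as the first review points out)
        theta0 = (0.01 * torch.randn(d, n_classes)).requires_grad_(True)

        # one differentiable inner gradient step on the synthetic set
        inner_loss = F.cross_entropy(x_syn @ theta0, y_syn)
        grad_theta, = torch.autograd.grad(inner_loss, theta0, create_graph=True)
        theta1 = theta0 - log_eta.exp() * grad_theta

        # outer objective: the one-step-trained weights should fit the real data
        outer_loss = F.cross_entropy(x_real @ theta1, y_real)
        outer_opt.zero_grad()
        outer_loss.backward()
        outer_opt.step()

    return x_syn.detach(), y_syn, log_eta.exp().item()

The per-class baseline one of the reviews asks for would amount to replacing x_syn with class-wise k-means centroids of the real inputs (for example sklearn.cluster.KMeans run separately on each class) and skipping the outer loop entirely.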
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper describes a rolebased learning model for the decpomdps the main contribution lies in the efficient discovery of roles from the joint action spaces and then learning a bilevel role assignment for each achievement this is achieved in two steps first the joint action space is clustered into different role action spaces that reduce the action search space for each role second a bilevel role assignment technique is used to learn action and roles for each agent the technique is tested on starcraft ii micromanagement environments for the action space reduction the model learns action representations that can reflect the effects of actions on the environment and other agents to this end a deep learning model is created which predicts the effects of joint actions on the induced rewards and change in the effects actions generating similar effects are cluster together using kmeans and are called roles action spaces this restricts the joint action search spaces for each role the role selector is now used to learn a bilevel hierarchical assignment to map the actionobservation history of each agent at the toplevel the agents are mapped to their corresponding to roles based on a qvalue function of each role conditioned on actionobservation history and at a lowerlevel similar qvalue function is used to find the agents action to avoid too many concurrent selections of a single role and action by multiple agents a global qvalue is learned from individual qvalues to ensure overall coordination between the agents this is inspired by qmix previous work on multiagent learning positives 1 the idea of reducing the search space by effectbased clustering appears interesting and novel 2 the technique leads to good exploration and performance in hard and super maps 3 the paper is wellwritten and the technique is extensively tested on all the maps with useful ablations minor issues 1 some commentsreasoning related to outlier roles and action spaces would have been helpful 2 do changes in the clustering algorithm leads to a significant difference in performance or role assignment 3 compared to the previous approaches the rodealgorithm learns slower in most of the easier maps update thank you for the response and updates to the paperdocsepthis paper introduces a bilevel hierarchical framework for achieving scalable multiagent learning in this framework the highlevel policy role selector coordinates role assignments in a smaller role space and at a lower temporal resolution and the lowlevel policies role policies explore strategies in reduced primitive actionobservation spaces in this way the complex multiagent problem is decomposed into multiple subproblems which is easy to learn the authors conduct the experiments in the starcraft ii micromanagement benchmark compared with the stateoftheart marl methods strength the paper is wellwritten and easytofollow the authors show the demo videos for a better demonstrating of the learned behavior the introduced idea is original and interesting i can see that the bilevel multiagent coordination framework is of great potential in solving various multiagent tasks it is nice to demonstrate the generalization of the learned policies on unseen maps provide the code in the supplementary material questions the action clustering how to choose the roles number k for clustering according to prior knowledge can you explain the effect of different k numbers on the performance is the policies for different levels trained simultaneously docsepthis paper proposes a new method called 
rode to learn roles for multiagent systems to improve the learning efficiency of marl instead of exploring in the full joint action space rode first decomposes the action space into k clusters based on the different influences of each action on the environment and other agents then rode trains a role selector and role policies conditioned on each agents actionobservation history experiments show its superior performance compared with sota marl algorithms this paper is highly related to the subject of iclr and is well written but i have some specific confusions listed below waiting for the authors reply the authors mentioned that rode differs from previous rolebased methods that require prior knowledge to learn roles however the action space decomposition also requires human knowledge eg would a wrong number of k clusters hinder the final performance given the local observation and all agents actions rode first learns the action representations by minimizing two prediction errors the next local observation and team reward then the role representation is calculated by averaging representations of actions in each cluster what i am concerned is that whether this average representation can well represent the roles which may further impact the estimation of role policies since the action representations are used to generate the qvalues of both role selector and role policies since in smac the attack action must be given an enemy index so the number of actions increases with the increase of the number of enemies transfer of rode needs to manually add new representations to each cluster and then recalculate the role representations in this way the role representations change does this influence the decisions of role selector and role policies furthermore these two components cannot be trained on new maps because the input and output of the mixing networks are different with a different number of agents some recent marl algorithms to improve qmix such as aiqmix1 weightedqmix2 should be discussed or compared i agree with the authors that the joint action space can be decomposed or classified based on the action effect or action semantics asn 2 firstly investigates the influence of actions on other agents which they call the action semantics i think these two works are inspired by the same property in mass to improve multiagent coordination moreover these two works both use smac as benchmarks in this way id like to see the comparison of asn and rode to further validate its superior performance 1 aiqmix attention and imagination for dynamic multiagent reinforcement learning arxiv preprint arxiv200604222 2 weighted qmix expanding monotonic value function factorisation neurips 2020 3 action semantics network considering the effects of actions in multiagent systems iclr 2020
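
The role-discovery step these reviews describe (cluster actions by a learned effect-based representation, then restrict each role to its cluster's actions) can be sketched as follows; the array shapes and names are illustrative assumptions rather than the authors' code.

import numpy as np
from sklearn.cluster import KMeans

def build_roles(action_embeddings: np.ndarray, n_roles: int):
    # action_embeddings: (n_actions, d) vectors trained to predict the reward and the
    # change in local observations caused by each action (the "effect" model above)
    km = KMeans(n_clusters=n_roles, n_init=10, random_state=0).fit(action_embeddings)
    role_of_action = km.labels_                       # cluster id per primitive action
    role_action_spaces = [np.flatnonzero(role_of_action == r) for r in range(n_roles)]
    # a role representation is taken here as the mean embedding of its actions,
    # which is what a role selector could score against
    role_repr = np.stack([action_embeddings[idx].mean(axis=0)
                          for idx in role_action_spaces])
    return role_action_spaces, role_repr

With this decomposition the role selector only has to pick among n_roles options at a lower temporal resolution, and each agent's low-level policy searches within its role_action_spaces entry, which is also why the choice of the cluster count k questioned in the third review matters.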
### Summary: | the paper proposes a twolevel hierarchical algorithm for efficient and scalable multiagent learning where the highlevel policy decides a reduced space for lowlevel to explore in all the reviewers liked the premise and the experimental evaluation reviewers had some clarification questions which were answered in the authors rebuttal after discussing the rebuttal ac as well as reviewers believe that the paper provides insights that will be useful for the multiagent learning community and recommend acceptance | [
30003, 310, 1677, … (input_ids truncated) ] | [ 1, 1, 1, … (attention_mask truncated) ] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
8631,
247,
2554,
3169,
4715,
1566,
323,
253,
1086,
81,
297,
69,
793,
253,
2022,
7680,
8696,
275,
253,
5919,
8900,
273,
9503,
432,
253,
6036,
2250,
8470,
285,
840,
4715,
247,
26413,
652,
2554,
12714,
323,
1016,
19797,
436,
310,
6786,
275,
767,
5018,
806,
253,
6036,
2250,
2317,
310,
29102,
715,
1027,
2554,
2250,
8470,
326,
4796,
253,
2250,
3186,
2317,
323,
1016,
2554,
1273,
247,
26413,
652,
2554,
12714,
5853,
310,
908,
281,
3037,
2250,
285,
9503,
323,
1016,
5570,
253,
5853,
310,
5762,
327,
331,
3178,
2694,
21255,
25390,
266,
5585,
12620,
50276,
1542,
253,
2250,
2317,
5141,
253,
1566,
33772,
2250,
14237,
326,
476,
4887,
253,
2538,
273,
5231,
327,
253,
3126,
285,
643,
6083,
281,
436,
990,
247,
3676,
4715,
1566,
310,
3562,
534,
26295,
253,
2538,
273,
6036,
5231,
327,
253,
5802,
23267,
285,
1818,
275,
253,
2538,
5231,
11365,
2074,
2538,
403,
7368,
2366,
970,
465,
30799,
285,
403,
1925,
9503,
2250,
8470,
436,
45798,
253,
6036,
2250,
3186,
8470,
323,
1016,
2554,
253,
2554,
23434,
310,
1024,
908,
281,
3037,
247,
26413,
652,
24498,
12714,
281,
3711,
253,
2250,
23705,
318,
2892,
273,
1016,
5570,
387,
253,
281,
713,
652,
253,
6083,
403,
18301,
281,
616,
3969,
281,
9503,
1754,
327,
247,
2805,
2877,
1159,
273,
1016,
2554,
27039,
327,
2250,
23705,
318,
2892,
285,
387,
247,
2406,
5251,
2074,
2805,
2877,
1159,
310,
908,
281,
1089,
253,
6083,
2250,
281,
3693,
1512,
1142,
17336,
36318,
273,
247,
2014,
2554,
285,
2250,
407,
2709,
6083,
247,
4156,
2805,
2877,
310,
6311,
432,
2060,
2805,
8858,
281,
5416,
4583,
19915,
875,
253,
6083,
436,
310,
11797,
407,
2805,
24706,
2045,
789,
327,
4471,
12788,
4715,
50276,
993,
23223,
337,
253,
2934,
273,
8493,
253,
3186,
2317,
407,
1055,
3169,
17524,
4620,
4722,
285,
4460,
374,
253,
5853,
5644,
281,
1175,
17947,
285,
3045,
275,
1892,
285,
2221,
8115,
495,
253,
2929,
310,
973,
15720,
285,
253,
5853,
310,
18171,
5762,
327,
512,
253,
8115,
342,
4217,
490,
77,
569,
50276,
37585,
3374,
337,
690,
5701,
10752,
272,
2905,
281,
562,
3623,
9503,
285,
2250,
8470,
651,
452,
644,
9371,
374,
513,
2544,
275,
253,
17524,
5933,
5644,
281,
247,
1534,
3064,
275,
3045,
390,
2554,
12714,
495,
2429,
281,
253,
2045,
7274,
253,
22461,
41528,
33772,
17357,
275,
954,
273,
253,
6927,
8115,
50275,
11183,
50276,
47033,
368,
323,
253,
2380,
285,
11269,
281,
253,
2929,
7152,
33032,
2520,
2929,
23970,
247,
26413,
652,
24498,
7792,
323,
17170,
44755,
4471,
12788,
4715,
275,
436,
7792,
253,
1029,
5251,
3646,
2554,
23434,
11627,
2554,
23768,
275,
247,
4577,
2554,
2317,
285,
387,
247,
2406,
11935,
6064,
285,
253,
1698,
5251,
7823,
2554,
7823,
8338,
8130,
275,
3777,
20523,
2250,
23705,
318,
8470,
275,
436,
1039,
253,
2570,
4471,
12788,
1895,
310,
45765,
715,
2709,
749,
856,
23042,
534,
310,
3477,
281,
3037,
253,
4477,
2589,
253,
4679,
275,
253,
331,
3178,
2694,
21255,
25390,
266,
5585,
22791,
2429,
342,
253,
1375,
23037,
14387,
2304,
77,
3082,
50276,
45563,
50276,
783,
2929,
310,
973,
15720,
285,
3477,
936,
25739,
253,
4477,
921,
253,
22020,
10556,
323,
247,
1805,
17227,
273,
253,
6311,
3879,
50276,
783,
5611,
2934,
310,
3236,
285,
4722,
891,
476,
923,
326,
253,
26413,
652,
4471,
12788,
19915,
7792,
310,
273,
1270,
2442,
275,
16161,
2710,
4471,
12788,
8892,
50276,
262,
310,
5322,
281,
7568,
253,
26647,
273,
253,
6311,
7823,
327,
39709,
8115,
50276,
42260,
253,
2127,
275,
253,
24864,
2144,
50276,
34974,
50276,
783,
2250,
17524,
849,
281,
5206,
253,
9503,
1180,
465,
323,
17524,
2556,
281,
2720,
3640,
476,
368,
5513,
253,
1055,
273,
1027,
465,
3904,
327,
253,
3045,
50275,
261,
253,
7823,
323,
1027,
2308,
10166,
10486,
5474,
33032,
2520,
2929,
29328,
247,
747,
1332,
1925,
22461,
281,
3037,
9503,
323,
4471,
12788,
2718,
281,
3157,
253,
4715,
6733,
273,
2304,
77,
3185,
273,
18216,
275,
253,
2120,
6036,
2250,
2317,
22461,
806,
11101,
6013,
253,
2250,
2317,
715,
465,
9959,
1754,
327,
253,
1027,
16178,
273,
1016,
2250,
327,
253,
3126,
285,
643,
6083,
840,
22461,
18784,
247,
2554,
23434,
285,
2554,
7823,
27039,
327,
1016,
6083,
2250,
23705,
318,
2892,
4679,
921,
697,
8936,
3045,
2429,
342,
256,
5503,
2304,
77,
11333,
50275,
2520,
2929,
310,
4122,
2905,
281,
253,
2256,
273,
17857,
32888,
285,
310,
973,
3542,
533,
891,
452,
690,
2173,
1461,
16723,
7117,
2708,
6179,
323,
253,
4477,
12252,
50276,
783,
4477,
5393,
326,
22461,
19986,
432,
2045,
2554,
3169,
3082,
326,
2430,
2720,
3640,
281,
3037,
9503,
2299,
253,
2250,
2317,
14717,
671,
4419,
1966,
3640,
24088,
651,
247,
3430,
1180,
273,
465,
9959,
35007,
253,
2457,
3045,
50276,
28821,
253,
1980,
8310,
285,
512,
6083,
5231,
22461,
806,
33772,
253,
2250,
14237,
407,
28699,
767,
10554,
6332,
253,
1735,
1980,
8310,
285,
2285,
10921,
840,
253,
2554,
6779,
310,
5118,
407,
25001,
14237,
273,
5231,
275,
1016,
7368,
752,
891,
717,
7514,
310,
326,
1880,
436,
3388,
6779,
476,
973,
1957,
253,
9503,
534,
778,
2007,
3486,
253,
13418,
273,
2554,
7823,
1580,
253,
2250,
14237,
403,
908,
281,
6635,
253,
2805,
8858,
273,
1097,
2554,
23434,
285,
2554,
7823,
50274,
17480,
275,
924,
317,
253,
2983,
2250,
1364,
320,
1677,
271,
9054,
3605,
594,
253,
1180,
273,
5231,
5459,
342,
253,
2572,
273,
253,
1180,
273,
13948,
3700,
273,
22461,
3198,
281,
13542,
823,
747,
14237,
281,
1016,
7368,
285,
840,
42545,
3699,
253,
2554,
14237,
275,
436,
1039,
253,
2554,
14237,
1818,
1057,
436,
4833,
253,
7089,
273,
2554,
23434,
285,
2554,
7823,
33810,
841,
767,
4295,
2550,
320,
10166,
327,
747,
8115,
984,
253,
3280,
285,
3453,
273,
253,
12480,
6928,
403,
1027,
342,
247,
1027,
1180,
273,
6083,
50275,
8826,
3332,
2304,
77,
11333,
281,
3157,
2805,
24706,
824,
347,
23105,
82,
24706,
18,
17375,
82,
24706,
19,
943,
320,
5469,
390,
2429,
50276,
74,
5194,
342,
253,
4477,
326,
253,
6036,
2250,
2317,
476,
320,
45765,
390,
10509,
1754,
327,
253,
2250,
1055,
390,
2250,
35185,
347,
79,
374,
41005,
2340,
684,
253,
4833,
273,
5231,
327,
643,
6083,
534,
597,
1067,
253,
2250,
35185,
891,
1158,
841,
767,
2987,
403,
11797,
407,
253,
1072,
2867,
275,
2280,
281,
3157,
4471,
12788,
19915,
25761,
841,
767,
2987,
1097,
897,
924,
317,
347,
49602,
275,
436,
1039,
2654,
751,
281,
923,
253,
5301,
273,
347,
79,
285,
22461,
281,
2007,
17813,
697,
8936,
3045,
50276,
18,
23105,
82,
24706,
4116,
285,
17368,
323,
7870,
4471,
12788,
35221,
4715,
549,
32693,
638,
3845,
549,
32693,
1518,
26890,
18895,
50276,
19,
17375,
2805,
24706,
16122,
45973,
1318,
1159,
2803,
5837,
5723,
2824,
9169,
50276,
20,
2250,
35185,
2990,
7296,
253,
2538,
273,
5231,
275,
4471,
12788,
2718,
17857,
32888,
9169,
187,
187,
4118,
18435,
27,
783,
2929,
29328,
247,
767,
5251,
24498,
5933,
323,
5919,
285,
44755,
4471,
12788,
4715,
835,
253,
1029,
5251,
3646,
21936,
247,
3777,
2317,
323,
1698,
5251,
281,
8338,
275,
512,
253,
30628,
10490,
253,
26536,
285,
253,
5661,
7103,
30628,
574,
690,
37699,
3533,
534,
497,
9577,
275,
253,
4477,
30080,
22559,
846,
16585,
253,
30080,
22559,
913,
347,
973,
347,
30628,
2868,
326,
253,
2929,
3400,
16039,
326,
588,
320,
4217,
323,
253,
4471,
12788,
4715,
3114,
285,
5583,
14924
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper is well written and presented giving a good literature review and clearly explaining the design decisions and tradeoffs the paper proposes a novel factorisation approach and uses recurrent networks the evaluation is both quantitative and qualitative the qualitative experiment is interesting but there is no information given about the level of musical training the participants had you would expect very different results from music students compared to the general public how did you control for musical ability understanding the paper has a refreshing honesty in its critical evaluation of the results highlighting fundamental problems in this field overall while i am not an expert in musical composition and machine learning the paper is clear and appears to be advancing the art in a reliable fashiondocsep composing polyphonic music is a hard computational problem this paper views the problem as modelling a probability distribution over musical scores that is parametrized using convolutional and recurrent networks emphasis is given to careful evaluation both quantitatively and qualitatively the technical parts are quite poorly written the introduction is quite well written and it is easy to follow it provides a good review that is nicely balanced between older and recent literature unfortunately at the technical parts the paper starts to suffer due to sloppy notation the cross entropy definition is missing important details what does s exactly denote are you referring to a binary piano roll or some abstract vector valued process this leaves a lot of guess work to the reader even the footnote makes it evident that the authors may have a different mental picture i would argue that a piano roll does not need two bits take a binary matrix rollnoten timet 1 0 when note n is present absent at time t i also think the term factorization is sometimes used freely as a synonym for representation in last paragraphs of 4 and first two paragraphs of 5 i find this misleading without proper definitions the models which are central to the message of the paper are not described clearly please define function acdot in 2 3 4 this maybe possibly a typesetting issue and a is highly likely a sigmoid but what does xp whp x xpt etc stand for various contractions you have only defined the tensor as xtpn even there the proposed encoding is difficult to follow using different names for different ranges of the same index n and d seems to be avoiding important details and calling for trouble why not just introduce an order 4 tensor and represent everything in the product space as every note must have a duration while the paper includes some interesting ideas about representation of relative pitch the poor technical writing makes it not suitable to iclr and hard to judgeinterpret the extensive simulation results minor for tensors rank3 is not correct use please use order3 here if you are referring to the number of dimensions of the multiway array what is a nonlinear sampling scheme please be more precise the allanwilliams citation and year is broken moray allan and christopher k i williams harmonising chorales by probabilistic inference advances in neural information processing systems 17 2005 docseppros seemingly reasonable approach to polyphonic music generation figuring out a way to splitting the parts share parameters appropriately measuring entropy per time all make sense the resulting outputs tend to have very shortterm harmonic coherence eg often a standard chord with some resolving suspensions etc with 
individual parts often making very small stepwise motion ie reasonable local voice leading extensive comparison of architectural variations positive results from listening experiments cons musical outputs are not clearly better than some of the polyphonic systems described despite the often small melodic steps the individual lines are quite random sounding this is perhaps a direct result of the short history i do not hear the rhythmic complexity that is described in the introduction the work by johnson 2015 ref provided below should be looked at and listened to it too uses coupled networks albeit in a different way but with a related motivation and has rhythmic and polyphonic complexity and sounds quite good better in my opinion some unclear sections fixable especially with an appendix more detail below despite the extensive architectural comparisons i was not always clear about rationale behind certain choices eg if using recurrent nets why not try lstm or gru more questions below would like to have heard the listening tests or at least read more about how samples were selected again perhaps in an appendix and additional sample files quality clarity originality and significance of this work including a list of its pros and cons max 200000 characters quality in this work various goodreasonable choices are made the quality of the actual output is fine it is comparable to and to my ears not better than existing polyphonic systems such as the ones below links to sample audio are provided here bachbot httpssoundcloudcombachbot liang et al 2017 tied parallel nets httpwwwhexahedriacom20150803composingmusicwithrecurrentneuralnetworks johnson 2015 ref below performancernn httpsmagentatensorfloworgperformancernn simon oore 2017 others as well clarity some of the writing is locally clear but one large poorlyorganized section makes the whole thing confusing details below it is very helpful that the authors subsequently added a comment with a link to some sample scores without that it had been utterly impossible to evaluate the quality there are a few points that could be better clarified p5a multihot vector of notes n it sounds like n will be used to denote notenumbers but in fact it seems like n is the total number of notes ie the length of the vector right what value of n is used p5 a onehot vector of durations d it sounds like d will be used to denote durations but actually i think d is the length of the 1hot vector encoding durations right what value of d is used and what durations do the elements of this vector represent similarly does t represent the size of the history this should really be clarified p5 polyphonic models eq 2 3 4 presumably the hs are the hidden activations layers the networks here correspond to the blue circles in fig 1 right if so make the relationship clear and explicit note that most variables in most equations are left undefined actually defining the ws in eq24 would allow the authors to refer to the ws later eg in section 52 when describing weightsharing ideas otherwise its all rather confusing for example the authors could write thus we can set wp1 wp2 wp3 wp4 or whatever is appropriate generally i found that pages 57 describe many ideas and some of them are individually fairly clearly described but it is not always clear when one idea is beginning and one idea is ending and which ideas can be combined or not on my first readings i thought that i was basically following it until i got to table 5 which then convinced me that i was in fact not quite following it for 
example i had been certain that all the networks described are recurrent perhaps due to fig1 but then it turned out that many are in fact not recurrent which made a lot more sense given the continual reference to the history and the length of the models markov window etc but the reader should not have had to deduce this for example one could write we will consider 3 types of architectures convolutional recurrent in each architecture we will have modules and we will try a variety of combinations of these modules the modulescomponents are as follows its a bit prosaic but it can really help the reader appendices presented well could be immensely helpful in clarifying the exact architectures obviously not all 22 architectures from table 5 need to be shown but at least a few of them shown explicitly would help clarify for example in fig1 the purple boxes seem to represent notes according to the caption but do they actually represent networks if they really do represent notes then how can notes receive inputs from both the partnetworks and the global network also i was not entirely clear on the relationship of the architecture of the individual nets for the parts to that of the global integrating network eg for experiment 20 the partnet is an rnn with how many layers with regular or lstm cells followed by a loglinear predictor with one hidden layer of 300 units right or are there multiple layers sometimes but then what is the global network why does the longest parthistory vector appear to have length 10 based on table 5 but according to table 3 the bestperforming history length was 20 though i am not sure the meaning of the bottomtop column was explained anywhere so maybe i am completely misunderstanding that aspect of the table etc many piano scores do not easily deconstruct into clean 4part polyphony the example in appendix a is an exception it was not clear to me how piano scores were handled during training terminology it is not entirely clear to me why one section is entitled homophonic models instead of just monophonic models homophonic music usually involves a melody line that is supported by other voices ie a sort of asymmetry in the partwise structure here the outputs are quite the opposite of that the voices are independent they generally function well together harmonically and there is usually no sense of one voice containing a melody if theres some reason to call it homophonic that would be fine but otherwise it doesnt really serve to clarify anything however the authors do say that the homophonic composition tasks are a minor generalization of classic monophonic composition tasks so this suggests to me that there is something here that i am not quite understanding the last sentence of section 53 is very confusing i dont understand what linn is or 1n is or how to read the corresponding entries of the table the first part of the paragraph is fairly clear table 4 the first row actually seems like it is referring to the second row i know what the authors mean but it is unnecessarily confusing to refer to it in this way one might as well refer to the zeroth row as listing the duration of the clip the experimental evaluation i would like to hear some of the paired samples that were played for subjects were classical score excerpts chosen starting at random locations in the score or at the beginning of the score it is known that listening to a 10second excerpt without context can sometimes not make sense i would be curious to see the false positives versus the false negatives nevertheless 
i certainly appreciate the authors warning to interpret the listening results with caution originality significance so far based both on the techniques and the output i am not entirely convinced of the originality or significance of this particular system the authors refer to rhythmically simple polyphonic scores such as bachbot but i cannot see what is rhythmically fundamentally more sophisticated about the scores being generated by the present system one nice characteristic of the present system is the true and audible independence of the voices one of the contributions appears to be the construction of models that explicitly leverage with shared weights some of the patterns that occur in different places pitchwise and temporally in music this is both very reasonable and also not an entirely novel idea see for example the excellent work by daniel johnson generating polyphonic music using tied parallel networks paper published 2017 first shared online as far as i know in 2015 links to all materials available at httpwwwhexahedriacom20150803composingmusicwithrecurrentneuralnetworks another now common and non exclusive way to handle some of this is by augmenting the data with transposition it seems that the authors are not doing this here why not it usually helps another contribution appears to be the use of a pertime measure of loss this is reasonable and i believe others have done this as well i certainly appreciated the explicit justification for it however note that the idea of using a vector to indicate metric subdivision was also used in johnson 2015 playing through some of the scores it is clear that melodies themselves are often quite unusual check user studies but the voices do stay closely connected harmonically which is what gives the system a certain aural coherence i would be interested to hear and look at what is generated in twopart harmony and even what is generated as a sort of baseline with just a single part i encourage the authors to look at and listen to the work by johnson listening samples httpwwwhexahedriacom20150803composingmusicwithrecurrentneuralnetworks associated publication httpwwwhexahedriacomfiles2017generatingpolyphonicpdf overall i think that the problem of generating rhythmically and polyphonically complex music is a good one the approaches seem to generally be reasonable although they do not appear to be particularly novel and the musical results are not particularly impressive the architectural choices are not always clearly presented
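To make the encoding questions raised above more concrete, here is a minimal sketch of the binary piano-roll matrix suggested in the second review, together with the multi-hot note vector and one-hot duration vector discussed in the third review. This is an illustrative reconstruction under assumed dimensions (note range, time grid, duration bins), not the paper's actual implementation.

```python
import numpy as np

# assumed dimensions, not taken from the paper
N_NOTES = 88                               # piano pitch range
N_STEPS = 64                               # e.g. a 16th-note grid over four bars
DURATIONS = [1, 2, 3, 4, 6, 8, 12, 16]     # hypothetical duration vocabulary in grid units

def piano_roll(events, n_notes=N_NOTES, n_steps=N_STEPS):
    """Binary matrix roll[note, time] = 1 iff that note sounds at that time step."""
    roll = np.zeros((n_notes, n_steps), dtype=np.int8)
    for note, onset, length in events:     # events: (pitch index, onset step, length in steps)
        roll[note, onset:onset + length] = 1
    return roll

def multi_hot_notes(roll, t):
    """Multi-hot note vector for time step t (length n_notes)."""
    return roll[:, t].astype(np.float32)

def one_hot_duration(length, bins=DURATIONS):
    """One-hot vector over the duration vocabulary (length len(bins))."""
    vec = np.zeros(len(bins), dtype=np.float32)
    vec[bins.index(length)] = 1.0
    return vec

# tiny example: a C major triad held for four steps from step 0
roll = piano_roll([(39, 0, 4), (43, 0, 4), (46, 0, 4)])
print(multi_hot_notes(roll, 0).sum())      # 3.0 -> three notes active at step 0
print(one_hot_duration(4))                 # duration 4 as a one-hot vector of length 8
```

A representation written out this way makes the meaning of n (number of pitches) and d (number of duration bins) explicit, which is essentially what the reviewers are asking the authors to state.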
### Summary:
this paper proposes novel recurrent models for polyphonic music composition and demonstrates the approach with qualitative and quantitative evaluations as well as samples the technical parts in the original writeup were not very clear as noted by multiple reviewers during the review period the presentation was improved unfortunately the reviewer scores are mixed and are on the lower side mainly because of the lack of clarity and quality of the results
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies the intraclass clustering ability of neural networks trained in supervised learning and found that networks show intraclass clustering ability despite not explicitly enforced by the label to do so and criterions based on those correlate well with model generalization performance however i think the current state of the paper is still a bit below the standard of iclr and there are a few ways that this paper could be potentially improved 1 i think observation 1 clustering is learned without explicit constraints is more interesting than observation 2 correlate well with generalization performance because it is intuitive that if you can classify a finer class assignment well then you should also do well on more coarse classes maybe one thing that can be added to the experiments is to study whether the intraclass clusterability could also overfit in other words do you see a big generalization gap if you measure your criterions on the training set and test set separately 2 as mentioned in the previous point i think it is probably worth allocating more space to study how why the intraclass clustering ability is learned during training despite supervised with corse label assignments i find it especially intriguing from fig4 that all the clusterability metrics are actually very low at the beginning of the training where the network weights are random my intuition was that at the beginning of training there are probably some neurons that is lucky and separate the intraclass clustering well but it seems that my intuition is wrong it would be great if the paper has more experiments focusing on discovering what happens during the training that helps intraclass clustering for example are those intraclass clusters actually pushed away or they stay the same distances to each other but benefits from byproduct of supervised learning that shrinks all the clusters other experiments could be to study what kind of regularization training techniques affect clusterability and can we explicitly encourage clusterability to improve generalization during training etc 3 im a bit uncomfortable with the max operation taken in the definition of clusterability criterions it essentially searches over all the neurons looking for patterns that you want to see given that there are a lot of randomness in the neural network weights this might not be depicting the typical behavior of neurons for example it might be that there is a random neuron that happen to correlate very well with the intraclass clusterability maybe one can replace the max with the topk averaging with a relatively large k like in the case of variances or even with something like 99 percentile a baseline can also be provided to see how much the randomness comes into play one can define a random clustering assignments and compute the same criterion and see what number could be achieved 4 could you please include details in the experimental setup showing how do you get the intraclass clustering assignments that is needed to compute the criterions i imagine for the 20class cifar100 you can just use the original labels as clustering assignment but i did not find how you do it for other datasets 5 similar to the previous point in practice one usually do not have the intraclass cluster assignment available to compute those criterions in contrast the sharpness criterion can be computed without extra information can you add some materials to show what could be done in this case maybe show some way to approximately estimate the proposed metrics and 
study the estimation accuraciesdocsep 1 brief summary the authors note that in classification tasks there typically exist withinclass groups of similar images that are not explicitly encoded in the coarse class label they call this intraclass clustering they hypothesize that the ability of dnns to recognize these intraclass clusters without being explicitly told about that could correlate with generalization they then proceed to verify this on a range of networks architectures and a large number of hyperparameter configurations they take care to establish causality where possible in addition they show that the intraclass clustering can be detected with simple variancebased methods and that it emerges early in training 2 strengths this is a super interesting question and i really like the paper overall i appreciate that you looked at a large hypercube of hyperparameters to establish correlation with generalization i also like the care you put into establishing causality the fact that you tried a simple variance based measure is also really good especially given that it is very predictive 3 weaknesses i think this paper is really good i have nothing much to point out here possible a large range of architectures and scaling up to imagenet would be useful to establish that this scales all the way to very large data but it is very good as is 4 related papers that you might like 1 you cite stiffness a new perspective on generalization in neural networks by stanislav fort pawe krzysztof nowak stanislaw jastrzebski srini narayanan httpsarxivorgabs190109491 as measuring an amount of classspecific clustering in that paper in figure 9 they show that stiffness is aware of the superclasses of cifar10 and even their supersuperclasses animals etc which seems relevant here too though there it goes the other way round the training is on subclasses generating awareness of superclasses your results are id stay stronger than that 2 in talking about the sensitivity to early stages of training the breakeven point on optimization trajectories of deep neural networks by stanislaw jastrzebski maciej szymczak stanislav fort devansh arpit jacek tabor kyunghyun cho krzysztof geras httpsarxivorgabs200209572 and iclr 2020 might be relevant where they also establish a very strong effects of the early stages of training 3 deep learning versus kernel learning an empirical study of loss landscape geometry and the time evolution of the neural tangent kernel by stanislav fort gintare karolina dziugaite mansheej paul sepideh kharaghani daniel roy surya ganguli httpsarxivorgabs201015110 and neurips 2020 also shows a very strong effect of the early stages of training on a large number of dnn measures 5 summary this paper is really good it starts with a strong hypothesis verifies it on a large number of experiments is mindful of causality and generates a potentially practically useful insight into generalization well donedocsepsummary this work investigates whether intraclass separation in the latent space of a neural network correlates with its generalization capabilities to perform this the authors design 4 measures that attempt to capture whether separate subclusters are formed for different subclasses of a superclass all measures take into account all activations in all layers of network 2 measures operate at neuron level 2 at layer level 2 are designed for cases where we do have explicit subclass labels and 2 are for the case that no such labels are given the work performs multiple experiments on cifar10 cifar100 
cifar100superclasses where multiple models are trained with varying hyperparameter configurations each leading to different performance then it is shown that the 4 measures correlate positively with performance which according to the authors suggests that intraclass separation within the network is happening and is important for generalization reasons for score i am very borderline on this on one hand it is interesting to see such a strong correlation of the intraclass separation with generalization but on the other hand there are many questions that the study leaves unanswered regarding whether we are measuring what we think we are measuring whether the results are sensitive or not to choice of hyperparameters for the measures etc in general i am currently left quite unconfident in drawing strong conclusions perhaps the rebuttal will help pros 1 the main research questions whether the networks have an inductive bias to learn features that perform intraclass separation without a training loss that demands it and whether this separation actually correlates with generalization are interesting questions 2 the results suggest a positive correlation of intraclass separability with generalization this may be an interesting finding to parts of the community for example there is work that attempts the opposite to improve generalization introducing losses for learning a single tight cluster per class losing intraclass separability and show that this improves generalization contradicting evidence could spark interesting discussion research in the community see references 1 2 further down in my comments 3 there is a very significant number of experiments performed 500 which may offer significant information to the community cons 1 the work presents no analysis of the behaviour of the measures with respect to design choices or the hyperparameters of the measures as a result i think we do not get much insight in exactly what is being measured for example are the measures capturing intraclasss eparation at 1st or last layer are the results robust to hyperparameters insights like this are important to understand the value of the results the empirical investigation is unfortunately limited to simply reporting a seemingly high correlation of the presented metrics with the generalization without further insights this limited insight is particularly troublesome when coupled with several questions i have about the design of the measures and their configuration which leave me unconfident about the rigor of the study and whether we are indeed measuring intraclass separation in meaningful ways see below questions for rebuttal for more details 2 there are design choices with respect to the measures and their application on the net that seem adhoc without appropriate explanation why use ktop values and not all why normalize the preactivations and take 25 top percent and so on so forth see detailed questions below 3 similarly there are configuration parameters that the work merely mentions the value chosen eg k5 without any explanation how this value was chosen nor any sensitivity study of whether results would change for different values this adds the unconfidence i have about the rigorousness of the study 4 reproducibility seems low code is not available there are no details on the values explored for the hyperparameters of the networks in the main experiments such as learning rates etc and it seems challenging for another party to rerun 500 experiments to reproduce the results it is especially in such studies 
that rigorous analysis for convincing results is extra important which i think leaves a lot to be desired here 5 experiments are performed only on cifar 10 100 i understand that performing all these experiments is expensive but on the other hand results are limited only to 1 type of data do we know if conclusions generalize questions to address during rebuttal period in sec 31 please clarify explicitly whether preactivations are before or after normalization by batchnorm this will interplay with statistics used in the study such as sigmand sec 322 and sec332 why use the cosine distance instead of euclidean if i am note mistaken euclidean would also evaluate whether a cluster is tight concentrated closer into a point rather than only the angle that the cosine measures i dont see a reason why the magnitude of activations wouldnt matter for clusterclass discrimination in an arbitrary deep net please clarify in text why you prefer the use of cosine and whether you believe things would be similar with other distances eg euclidean why is eq 1 and eq 3 done on the preactivations while eq 2 and eq 4 are done on the activations after relu have eq 2 4 been tried on preactivations and vice versa perhaps this also relates to why eq24 are on cosine distance instead of euclidean if so state reasons explicitly and if you ve tried things differently why do eq 1 and eq 2 use mediani and eq 3 4 use meani any theoretical or empirical support to this does the choice influence results please explicitly state in the paper too why do eq 1 and eq 2 use maxn and maxl respectively while eq 3 use meann and meanl any theoretical or empirical support to this does the choice influence results please explicitly state in the paper too the maxn and maxl operators as well as the meankn and meankl operators choose a single nl or ktop nl neuronslayers with separability from the whole network it could be the very first layer or the last do you have any insights on which layers are usually chosen i think this would add great insights if we knew if first or last layers are chosen does it make a difference on the value of this intraclass separability for example intracluster separability at last layers will say much about classification itself at the first layer it says that there are filters that merely identify multiple colors for the same class both white and black cats it would be nice to know what actually is being measured perhaps adding a discussion or a plot showing which layers are usually chosen would be very useful and it would add confidence to the reader about exactly what is being measured and how to interpret the resultsconclusions this is needed especially because some measures are not actually measuring intraclass clustering eg c3 c4 but something else that we assume indicates intraclass clutsering hence makes interpretation ambiguous how was the choice of values k5 for resnet k1 for vgg done in sec 332 do these values influence the behaviour would they influence the conclusions it would be nice to have a sensitivity study on such values sec 332 we found it helpful25 of the samples are activated please elaborate more on how this was found why do you think its important how was the value 25 defined and how the results change if this is not performed it would be nice to have empirical analysis of how this influences how the measure behaves and whether it changes any results fig 2 this is helpful in understanding why c3 and c4 may indicate indirectly some intraclass separation in some cases but what if a class 
presents unimodal distribution with large std and not bimodal as in the example can you think of a way to provide more supporting evidence that c3c4 capture intraclass separation here is where a more extensive analysis would have also been useful sec 52 how do the results with sharpness based measures you get tabl123 compare to those reported by jiang 2020 with a quick look of mine they seem quite close please confirm and discuss in your paper if the results are similar it adds confidence to the reader that your experimental setup is correct replicating theirs the sharpness measures do not require the class labels right if this is correct then their capability of predicting generalization should not be directly comparable with the measures in this study which do require the class labels the study does not do this directly but i think an explicit comment should be added for the reader so that they are aware of this significant difference fig3 can you discuss whether you think the values returned by the measures yaxis suggest the existence of actual intraclass clustering and support the main assumption of the paper in fact the values 001008 of the silhouette score seem very low silhouette is 1 to 1 with 0 meaning overlapping clusters right similarly the 0107 returned by c1 also seems small eg imagine 2 gaussians with same std for the 2 clusters if they wouldnt overlap the distance of their centers would be 6 stds right rough estimate 0107 suggests quite a strong overlap i think perhaps adding a discussion on this would be nice if you indeed agree there is no strong supporting evidence of clear overlap perhaps ensure there are no strong statements in the paper about it and mostly emphasize on the correlation to generalization instead sec 53 strong increases of the training accuracy can you explicitly state to which points eg epoch you are refering to moreover what happened at epoch 140 and caused the spike in training accuracy lowering learning rate i would advise you explicitly state it fig4 how do you group all the modelsexperiments in goodmediumlow performance groups clarify in text what exactly are these numbers 8231 in the caption state explicitly the reader should not have to guess about anything what does kendall coefficient 10 mean when dataaugmentation changes that all measures predict absolutely perfectly generalization isnt this weird how do authors interpret this perhaps a short discussion would add to the readers understanding insights and confidence in the interpretation of the results there is a body of literature that designs losses for learning 1 single tight cluster per class losing intraclass separation in the last layers and shows this improves generalization of classifiers the current study seems to contradict it suggesting intraclass clusters help generalization i think this should be discussed the issue is that because of no analysis we dont know whether the current measures find intraclass separation in early or late layers to help us understand how the results connect exactly with the rest of the literature contradicting it or complementing it if only earlier layers are chosen by the maxl operators this is one of the reasons that i think an analysis with further insights on how exactly the proposed measures compute is important as i said previously and add confidence to the reader by pointing out how they relate to existing literature if you would agree here a couple of references that do tight clustering a recent such work on deep nets i know is 1 which shows tight 
clustering improves generalization the work is mainly on semisupervision but in the end they also have experiments for standard fullysupervised learning similarly 2 show that label smoothing promotes tightclustering and improves generalization supervised learning 1 kamnitsas et al semisupervised learning via compact latent space clustering icml 2018 2 muller et al when does label smoothing help neurips 2019 to help with reproducibility is it possible to provide implementation of the measures even if not the whole code is possible to release and details about exactly what values have been explored for the networktraining hyperparameters in the main experiments in table1 etc eg in an appendix minor comments some additional feedback for improving the work which are not necessary to discuss within the rebuttal but do address as many as possible i think eq2 is not well descriptive of what is being described in sec 322 specifically it shows that silhoueteal ci is computed only on the set of a subclass ci if i understand correctly from the text instead you actually compute the mean silhoute score of all the samples in a superclass so i think it should be silhouetteal ssi moreover al is undefined is it supposed to be the cosine distances please clarify and update the text to make things clear perhaps give the equation for the silhouette score explicitly if this helps sec 1 the current statement could account for invalidated i think is too strong there are many studies that showed that these help generalization eg i think the positive effect of augmentation is unquestionable i would suggest rephrasing it to something more tactful like not the sole factors for generalization performance or something similar in sec 331 i think that the range of appropriate values for k is dependent on the number of classes in the database perhaps you would like to state this explicitly consider if you could also derive a rule of thumb likely empirically of what is a good k with respect to number of classes sec 42 ranking of modelsperformance i think this sentence descriing kendall coeff could use a slight rephrase to be more intuitive sec 42 by penalizing measures i am not sure that the approach penalizes rather it seems because it takes average over loads of values it merely is robust to outlier cases measure hyperparam tuned or averagesaway the influence of the hyperparam am i right perhaps a slight rephrase would avoid this confusion is designed to better capture causality i would suggest this argument to be rephrased a bit to avoid interpretation that it accurately captures causality it is merely robust due to averaging across hyperparams to the case of only 1 hyperparam correlating with generalization notice that jiang et al 2020 also take special care to not overstate the capabilities of this measure to identify causal factors see sec 222 in their paper summary of improvements and revisiting reviewers score after rebuttal summary of main points of improvement related to my comments after the revision the authors have significantly extended the analysis within the paper with more experiments to investigate the proposed measures and the deep nets behaviour in question this adds a lot of value to the paper offering more insights and support for the main claims the authors have added many intext clarifications about certain design choices and improved some eg by average kmax instead of max which makes the study much clearer and adds confidence to the reader about interpretating the investigation and its results the 
authors have added a study of the influence of the k parameter in the measures which simultaneously offers confidence in the conclusions conclusions hold for a significant range of values for k and offers new insights about how separability happens 1 neuron vs multiple reproducibility is greatly improved both by improved clarifications of the experimental settings within the text and by providing the code in the supplementary the authors improved discussions about the results with respect to related literature taking into account and linking them also to papers that at first glance seem to be contradicting eg references 12 i provided above that show tight clustering improves generalization these links and discussions further improve my confidence in the conclusions overall the authors have addressed sufficiently most of my concerns significantly improving the manuscript the updated manuscript provides significantly more insights which should be of interest to a part of the community the extra analysis provides better support of the studys hypothesis and claims remaining weaknesses of the paper include the cifaronly experimentation we do not know if conclusions hold beyond it i am happy to increase my reviews score from 5 marginally bellow acceptance to 7 good paper accept minor in case the paper gets accepted i would suggest the authors to try to complete fig 4 with the layerwise investigation which they said was too expensive for the rebuttal period i cannot estimate how expensive that can be hopefully it is possible in longer time period and would complete nicely fig 4 docsepsummary this paper introduces a notion coined intraclass clustering that describes a deep neural networks implicit ability to cluster within data 4 different quantities that measure the networks clustering ability are proposed and large scale experiments show that they are highly effective at predicting the models generalization ability across diverse types of hyperparameters pros 1 the paper presents an interesting idea that also seems to be practically highly relevant based on the experiments notably it performs nearly perfectly on several interesting hyperparameters that are challenging for previous measures i believe that there is definitely something unusual in this idea and it is worth future research 2 the experiments on the evolution of the coefficients is extremely interesting since this may indicate that we do not need to train the models to convergence to verify how good the models are the quantity is simple and seems to be efficient to compute cons 1 i think this paper is not very wellwritten and needs much more work for example why are there 4 different coefficients when two of them clearly outperform the previous two based on label hierarchies which are nonetheless the motivations for the work and supposedly model the actual intraclass clustering ability the paper claims that they are complementary but the experiments in my opinion suggests otherwise no further explanation is offered for this phenomenon and i am just left wondering i think this warrants much more investigation and explanation 2 what is the exact definition of a neuron and a layer the activation tensor is of shape nxhxwxc is the neuron a single scalar or single channel nxhxwx1 likewise is the layer the whole tensor of a single channel this is not very clear in the paper and it would help to spell out exactly how the variance or means are computed using formula 3 it would be nice to have a couple more comparisons instead of just sharpness 
4 its not immediately clear to me how this phenomenon can be converted to a learning theoretical argument about generalization but this is minor 5 i must admit that it is not clear to me why this phenomenon should capture generalization i assume the authors had some conjectures or intuitions so it would be nice to include that in the paper specifically what is special about intraclass does this mean good models are just memorizing prototypes 6 in 331 if the standard deviation computed over the samples of a class is high compared to the standard deviation computed over the entire dataset we infer that the neuron has learned features that differentiate samples belonging to this class why does standard deviation capture this i think an justification would be nice the figure in some sense does it but i dont think thats thorough enough 7 it would be good to have analysis of which layer is usually picked when doing max or top k comments 1 can intraclass clustering be viewed as some kind of specialization of different parts of the model or some kind of compression i think more interesting things can be said about this 2 what is al in eq 2 3 for figure 4 is there adversarial noise for the green curve 4 it would be interesting to see the evolution of the ranking instead of the coefficients as table 13 but this is quite expensive and i think the existing materials are already interesting enough 5 it would be nice to see the mutual information based metric from jiang et al although this is not high on priority if the authors are interested they could use the data and code released by pgdl competition at neurips which includes the implementation of mutual information metric 6 are all the models interpolating would be nice to see a visualization of training loss or accuracy conclusion this paper presents something that i personally believe is very interesting if not exciting but the paper needs quite a lot of works before it can be published i want to emphasize that i believe the idea is extremely promising but the presentation the execution and thoroughness of analysis unfortunately just miss the mark as such i am inclined to reject the paper for now i will increase my scores if the authors can address my concerns update after rebuttal i appreciate the reviewers hard work for adding this many new materials in a short period of time a large number of my concerns have been addressed and the quality of paper has improved significantly some newly added experiments give a lot more insights than the original draft as such i am in favor of accepting the paper and have increased my scores from 4 to 6 however i still have a few lingering questions in the light of the rebuttal and would love to see them addressed should the paper be accepted 1 re figure 4 the question is why the performance of the green curve is so bad i was wondering if some of the labels are wrongly labeled to induce this effect 2 for reducing the feature map to a single quantity why is the max operation chosen intuitively it makes more sense to use mean or even order statisitc to make the metric more robust i also understand that many of the experiments cannot be added in a short period of time and hope that the authors add them as promised in the rebuttal
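to make the variance-based criterion discussed in the reviews above concrete (the point that a neuron whose standard deviation within a class is high compared to its standard deviation over the entire dataset is taken as evidence of intraclass differentiation, and the suggestion to replace the max over neurons by a top-k mean), here is a minimal illustrative sketch of one plausible reading of that measure. the function name, array shapes, the default k and the per-class averaging are assumptions made for illustration only and are not the paper's exact eq 3.

```python
import numpy as np

def variance_based_clusterability(acts, labels, k=5):
    """Illustrative reconstruction of the variance-based criterion as described
    in the reviews (not the paper's exact definition): a neuron whose activations
    still vary a lot within a class, relative to its variation over the whole
    dataset, is taken as evidence of intraclass differentiation.

    acts   : (n_samples, n_neurons) activations of one layer
    labels : (n_samples,) class labels
    k      : number of top neurons to average over (top-k mean instead of a
             plain max, as the reviews suggest, to be robust to one lucky neuron)
    """
    global_std = acts.std(axis=0) + 1e-12            # per-neuron std over the whole dataset
    per_class_scores = []
    for c in np.unique(labels):
        class_std = acts[labels == c].std(axis=0)    # per-neuron std within class c
        ratio = class_std / global_std               # high ratio: neuron differentiates samples of class c
        per_class_scores.append(np.sort(ratio)[-k:].mean())  # top-k mean over neurons
    return float(np.mean(per_class_scores))          # average over classes
```

a score like this would then be ranked against the measured generalization gaps of the trained models, for example with scipy.stats.kendalltau, which is the kendall rank coefficient the reviews refer to when reporting how well a measure predicts generalization.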
### Summary:
the paper proposes intraclass clustering as an indicator of generalization performance and validates this by extensive empirical evaluation all reviewers have found this connection highly interesting the author response has also duly addressed most of the reviewers concerns given the importance of studying generalization performance of overparameterized deep models the paper will potentially generate interesting discussion at the conference