Dataset column summary (value type and per-example length range):

  Input            string      251 - 41.6k
  Output           string      137 - 9.7k
  input_ids        sequence    157 - 2.05k
  attention_mask   sequence    157 - 2.05k
  labels           sequence    157 - 2.05k
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors propose a physically sensible data augmentation that takes into account the actual variation of day light illumination in selfsupervised contexts they analyze the impact on the classification performance the sensitivity of the performance to illumination changes and the emergence of neurons sensitive or insensitive to color realistic data augmentation is important in selfsupervised learning because this is the sort of data variability that one wants to ignore and be invariant to overall this is an example of physicsaware machine learning or in simpler words do things just right the proposed planckvonkries color augmentation transform is more than 100 years old but it may be interesting for different audiences for different reasons a people with color science background will not find it surprising but they may be interested to see the quantitative gain that can be obtained from doing the classical right thing and b the average computer scientist facing this problem for the first time will find a good visual motivation to avoid random augmentation fig 1 and a physically founded recipe to get a 10 improvement in classification additionally authors make an interesting observation they argue that unrealistic random transforms may induce overinvariance insensitivity to certain aspects of the data thus reducing the performance of the model or shifting the focus to other features of the data i think their results illustrate this general point in their specific example augmentation through unrealistic color jitter reduces the sensitivity to color as confirmed by the number of colorsensitive neurons and leads to extra consideration of spatial features a similar argument could be made for other augmentation strategies if the database lacks natural variability the discovered features are going to be unbalanced overall i find this work as an interesting exercise in the context of data augmentation although the technique is not novel and the general consequences are not extracted other points to address while the general changeoffocus effect is true the authors should be more specific about when this is going to be a benefit as it is presented now it depends on the database or on the relevance of color for class discrimination my opinion is that the proposed augmentation as any natural data augmentation is going to be more positive in too artificial too restricted databases where illumination is fixed or it has too low variability these are the cases that require the introduction of a natural variability in this case of the illumination this is consistent with the fact that wide databases such as cifar100 as opposed to flower102 get no improvement from the proposed jitter see performance numbers or figure 6 maybe this is because cifar100 already includes the variability of natural illumination and hence explicit augmentation with the planckian variability is not necessary can the authors provide some extra evidence about the fact that neurons insensitive to color when conventional color jitter is introduced capture better the shapetexture information equation 4 for vonkries is confusing von krieslike adaptation consists of dividing each lms i may admit each rgb value by the value corresponding to the white or illuminant in situation 1 and multiply by the value corresponding to the white or illuminant in situation 2 see for instance chapt9 in fairchilds color appearance models 
2013 this implies the multiplication of each tristimulus vector by clarify eq 4 the spectral radiances in fig 4 are confusing using no normalization and a linear vertical axis implies that all the explored spectra seem blueish with bigger contribution of short wavelengths decay in energy at low temperatures of the black body radiator make it difficult to see that between 3000 and 4000 k the contribution of long wavelengths is bigger you have reddish illuminants too but it is hard to see in that figure if you dont nromalize the energy or put a log scale in the vertical axis note that for instance the cie a illuminant corresponds to 2856 k almost the 3000k reported by the authors and it is markedly reddish using the novel arc chromatic diagram does not add anything special with regard to classical cie xy diagram why not using the standard diagram incidentally i was surprised to see that the planck locus is straight in arc but the variations in the colors in fig 1 diagram at the right go in curves is this right eqs 1 and 2 seem to suggest that in fig 2 there is an additional multilayer perception in the lower branch of the training isnt it i find this work is an interesting exercise in the context of data augmentation an illustration of how classical models from physics may benefit the learning although the technique is not novel and the general consequences are not discussed in detail docsepoverview this paper proposes a novel type of image colour augmentation to be used during selfsupervised learning ssl background in a typical ssl setting similar samples are generated by randomly augmenting an image in a variety of different ways random cropping colour jittering random rotations etc these similar images are then passed through a neural network and the predicted features for similar samples are trained to be close to each other based on some similarity metric ignoring the contrastive term description here as it is not directly related to the paper motivation in this paper the authors address the issue of using colour jittering as augmentation during ssl specifically using colour jittering pushes the network towards invariance to image colours while relying more on the shape and texture of objects when making predictions the authors point out that despite the benefits of this for many general detection tasks this will be a detrimental property when dealing with more colourdependant tasks method thus they propose to use planckian jittering instead of colour jittering planckian jittering is a physics based augmentation method proposed by the authors although the formulations for it come from existing literature to reilluminate the training images within a realistic distribution which leads to more realistic and constrained colour augmentations than colour jittering the paper claims that planckian jittering still helps improve networks dependence on shape and texture of objects although less than colour jittering while limiting the networks invariance to image colours experiments 6 ssl models were trained independently on cifar100 3 used different variants of the planckian jittering 2 used different variants of colour jittering w and wo random grayscale and 1 used no augmentations linear classifiers for cifar100 and flowers102 classification tasks were then trained on top of each of the ssl models features where flowers102 is the task that is claimed to be more heavily colourdependant moreover an extra linear classifier was trained on the concatenation of features from a planckian jittering and 
a colour jittering model called the latent space combination model based on accuracy latent space combination outperforms all other models by a significant margin on both datasets planckian jitter seems to outperform other augmentations on flowers120 table 1 which supports the claim of the authors on cifar100 planckian jitter performs slightly worse than colour jittering which the authors attribute to the reduction of colour invariance in the features a very similar experiment was also done with different datasets ssl training on tinyimagenet linear classifier trained on flowers102 cub200 vegfru t1k which also obtained similar results and conclusions moreover in another similar experiment the ssl models were trained with different ssl configurations simsiam simclr barlow twins to indicate the generality of planckian jitter for different types of ssl configurations in another experiment the robustness of the different models on augmented images using planckian jittering was evaluated lastly the colour sensitivity was analyzed to inspect the impact of colour information in neuron activations for each model pros well written and easy to follow manuscript
 the idea of the physicsbased planckian jitter which transforms colours along chromaticity lines is interesting there does seem to be some support in the experiments for planckian jitter improving results for colourdependant domains ie table 1 when comparing default color jitter wo random grayscale and planckian jitter cons comparing the latent space combination with other forms of augmentations seems very unfair since latent space combination has double the capacity and requires double the training compute to me latent space combination seems more as an ensemble of models to make it a fair comparison it needs to be compared with other ensemble combinations eg default color jitter wo random grayscalenone or default color jitter wo random grayscalerandom crop seems like a reasonable baseline otherwise i think latent space combination should be removed altogether since cropping is one of the most important transformations for ssl for all practical purposes there should be two versions of table 1 one with and one without random cropping applied i can imagine random cropping may impact the results the main focus in current comparisons should be between default color jitter wo random grayscale and planckian jitter but this was left out entirely in tables 2 and 3 comparing default color jitter with random grayscale against planckian jitter does not seem fair adding to above point even comparing planckian jitter against default color jitter wo random grayscale isnt really enough since as clearly indicated in table 1 planckian jitter mainly modifies brightness and contrast there should be a direct comparison with an augmentation which changes brightness contrast brightnesscontrast and hue there isnt any large scale datasets used in the experimentations which raises the question of scalability and whether the planckian augmentation would still be relevant with larger datasets planckian jitter is an augmentation technique which can also be evaluated on supervised learning tasks where image colours are important and similarly for transfer learning afterwards adding supervised experiments can add to the comprehensiveness of the experiments section 421 mentions as can be seen in table 1 if color augmentations are removed completely none configuration the accuracy drops of 18 this seems incorrect when comparing default color jitter wo random grayscale to none the left plot of the colour sensitivity analysis fig 3 seems unfair the planckian jitter models were specifically trained to ignore planckian jitters so one would naturally expect their results to be more robust with respect to planckian jittering

 typos section 31 paragraph 1 states default color jitter see fig 3 i believe this was intended to reference fig 1 questions for authors 1 it seems very odd to me that in table 1 none performs so similar to default color jitter wo random grayscale do you have any thoughts on why this happens 2 were there any other augmentations used during training cropping rotations etc edits nov 3 typo fix this paper proposes an interesting idea and is very well written the issues with colour jittering pointed out in this paper were previously known thus i view this paper as more geared towards practical purposes however the experiments are not comprehensive enough to strongly support the practical claims thus my vote is for the paper to be rejected in its current form docsepthis paper first examines that typical color jittering augmentation is harmful to feature representation learning then the authors proposed a physicsbased color augmentation called planckian jitter to improve the performance the proposed planckian jitter performs better with the recent contrastive and selfsupervised learning schemes strengths the proposed physicsbased color augmentation is easy to understand and implement the performance seems to be surprisingly better than typical random color jittering this color augmentation can be applied to a wide variety of tasks weaknesses is the proposed physicsbased color augmentation slower than typical color jittering many tasks need color augmentation performed on the fly during training if the proposed color augmentation is much slower the impact of this work will be marginal this paper proposed a general color augmentation that performs better than the typical color jittering and demonstrates better performance in contrastive and selfsupervised learning schemes the only concern i have is the speed ### Summary:
the reviewers were in general lukewarm about the paper not convinced by why realistic augmentation mean more robust features in ssl had concerns over the szie of the datasets up to 100k and the success depends on the relevance of color for classification the ac agrees with the reviewers while the paper sounds interesting there are many questions remain unanswered its unclear that the rebuttal addressed the concerns shared by the reviewers in addition to the comments by the reviewers the ac also feels that the overall design is adhoc and its unclear that the proposed augmentation can generalize to larger more practical problems
input_ids: roughly 2,000 token IDs encoding a (truncated) span of the review text above; the full integer list is omitted here.
attention_mask: a matching sequence consisting entirely of 1s; omitted.
labels: identical to input_ids; omitted.
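The first record's reviews describe the mechanism behind the proposed Planckian jitter in words: a von Kries-style adaptation divides each RGB (or LMS) value by the white point of the original illuminant and multiplies it by the white point of the target illuminant, i.e. a per-channel diagonal gain, with target illuminants drawn from the black-body (Planckian) locus. A minimal NumPy sketch of that rule is given below; the function names, the white-point table, the assumption that the source image is already white-balanced, and the temperature values are illustrative placeholders, not the transform or parameters of the paper under review.

```python
import numpy as np

def von_kries_reilluminate(image, white_src, white_dst):
    """Von Kries-style diagonal adaptation: divide each channel by the source
    illuminant's white point and multiply by the target illuminant's white point.

    image     : float array of shape (H, W, 3), linear RGB in [0, 1]
    white_src : length-3 RGB white point under the original illuminant
    white_dst : length-3 RGB white point under the new illuminant
    """
    gain = np.asarray(white_dst, float) / np.asarray(white_src, float)
    return np.clip(image * gain, 0.0, 1.0)

def planckian_jitter(image, rng=None):
    """Illustrative Planckian-style jitter: pick a random white point loosely
    associated with a black-body colour temperature and re-illuminate toward it.
    The white points below are rough placeholder values, not the paper's."""
    rng = rng or np.random.default_rng()
    planckian_whites = {           # temperature (K) -> approximate linear-RGB white point
        3000: (1.00, 0.71, 0.42),  # warm / reddish
        5000: (1.00, 0.89, 0.80),
        6500: (0.95, 0.97, 1.00),  # near daylight
        10000: (0.78, 0.84, 1.00), # cool / bluish
    }
    temp = int(rng.choice(list(planckian_whites)))
    src = (1.0, 1.0, 1.0)          # assume the input image is already white-balanced
    return von_kries_reilluminate(image, src, planckian_whites[temp])
```

In the self-supervised setting described by the second reviewer, such a transform would simply replace (or complement) the usual colour jitter when generating the two augmented views whose features are pulled together by the similarity loss.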
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors analyse the behavior of sgd under gradient clipping their analysis in the univariate case shows that gradient clipping in the heavytailed gradient noise almost eliminates the algorithms tendency to stay at sharp minima the authors support their analysis with synthetic experiments the authors then conduct experiments on real data where they add heavy tailed noise to the gradient and clip it afterwards i believe that the topics of sgdtrained networks generalization in general and the role of heavytailed parameter statistics in this in specific are very timely and worthy of attention since our understanding of the dynamics that lead to generalization is not on par with the empirical success of sgdbased methods i find the authors analyses and results interesting and wellpresented i think it has potential to improve our understanding of the generalization characteristics of sgdtrained networks although their analyses mostly focus on the univariate case i believe that this is acceptable as they provide some seminal results regarding truncatedgradient sgd however i have a hard time understanding the authors characterization of recent results in the literature or lack thereof which also informs their experiment design my point can be most dramatically made by drawing attention to the choice of baseline that the authors present while the recent theoretical and empirical findings in literature emphasize the relationship between learning rate and tail index andor generalization the authors somehow base their methodology and experiments on the supposed absence of heavy tails in gradient noise in image classification tasks this is not necessarily true and this fact is welldocumented given this fact the fact that the authors present the baselines as thintailed noise algorithms without any detailed analysis of learning rate batch size and their effects on generalization as well as not estimating the tail index of these noises is surprising i think the authors need to take into account more recent results in the literature and possibly alter their experimental settings and discussion accordingly this is especially important since the authors aim to analyse why a specific modification of sgd leads to improvements hodgkinson liam and michael w mahoney 2020 multiplicative noise and heavy tails in stochastic optimization arxiv200606293 cs math stat june httparxivorgabs200606293 gurbuzbalaban mert umut imekli and lingjiong zhu 2021 the heavytail phenomenon in sgd arxiv200604740 cs math stat june httparxivorgabs200604740 lewkowycz aitor yasaman bahri ethan dyer jascha sohldickstein and guy gurari 2020 the large learning rate phase of deep learning the catapult mechanism march httpsarxivorgabs200302218v1 the authors present interesting analyses and results regarding a modified version of sgd in optimization their characterization of the recent literature and the experimental design based thereupon seems to need more attention docsepthis paper study the longtime behavior of heavytailed sgd with gradient clipping it is found that gradient clipping is crucial for heavytailed sgd to avoid sharp minima the basic intuition is that the clipping operation reduces the distance moved by each sgd update therefore for minima narrow than the threshold the clipping does not change the first exit time however for wide minima sgd is slowed down and takes more time to escape consequently it is more likely 
that sgd locates in wide minima pros a beautiful theoretical analysis is provided for a onedimensional landscape under some structure assumptions very insightful synthetical experiments are provided to justify the intuitions and theory inspired by the above analysis the authors proposed two variants of sgd with injected heavytailed noise gradient clipping these modified sgds are expected to converge to flatter minima thereby generalizing better the experiments for deep nets on fashionmnist and cifar10cifar100 are sufficient and show promising improvements over vanilla sgd cons it seems that all the analyses ignore the noise structure and are only concerned with the magnitude for instance for even a onedimensional problem the gradient noise of vanilla minibatch sgd is statedependent however gradient noises in the prototypical dynamics analyzed in this paper see eq 2 are iid random variables as another example figure 1 can be misleading since the lighttailed sgds only use stateindependent noise it is possible that sgd with structured noise can avoid those sharp minima completely the authors should disentangle the effect of noise magnitudes and noise structures and state clearly what is concerned in this paper for the related work previous work studying the stabilitydriven escaping from sharp minima is completely ignored i would suggest comparing with them see 123 and the reference therein for example the stabilitydriven escaping can tell us that in figure 1 gd with a relatively large learning rate never converge to the sharp minima m1 m2 other comments in figure 1 please make it clear in the caption that where the sgd starts from in the second paragraph of page 2 i am not sure why sgd with lighttailed noise never escape the sharp minima m3 i suppose that by adding large enough noise sgd at least can escape from it although it is as efficient as the heavytailed sgd if we stop training sgd at an arbitrary time point it is almost guaranteed that it wont be at a sharp minimum this claim seems wrong to me although heavytailed sgd with gradient clipping can avoid the sharp minima completely it may take a very long time in section 4 contrary to the report in s ims ekli et al 2019a heavytailed noise may not be ubiquitous in image classification tasks why is there this contradiction in particular table 1 suggests that sgdheavytailed noise performs very badly even worse than largebatch sgd can you explain it this also contradicts the synthetical experiments where the heavytailed sgd converges to flat minima more likely than lighttailed sgd in paragraph above section 5 event even 1 wu lei et al how sgd selects the global minima in overparameterized learning a dynamical stability perspective 2 jastrzebski et al the breakeven point on optimization trajectories of deep neural networks 3 cohen et al gradient descent on neural networks typically occurs at the edge of stability this paper reveals that heavytailed noisegradient clipping can help sgd eliminate the sharp minima beautiful theoretical analysis and insightful numerical experiments are provided docsepthe paper studies gradient descent with injected powerlaw tail noise the work shows that in the infinitesimal learning rate regime the heavy tail noise can cause gd to not to converge to sharp minima based on this theory the paper proposes a technique to inject noise to gd to help training i appreciate the mathematical rigor of the work but i am not convinced by its machine learning deep learning relevance specifically i find the following points 
problematic 1 the title seems inappropriate the word sgd is taken to mean a special kind of noise that is due to minibatch sampling this work however only studies gd with injected powerlaw noise i think it is misleading to say sgd in the title 2 the assumptions seem too strong and deep learning irrelevant one of my main objections is that the paper assumes finitely many minima yet neural networks both underparametrized and overparametrized should have infinitely many minima and the fact that the minima of neural networks are degenerate makes it inappropriate to apply a transition graph analysis this then makes me think that the theoretical analysis is not relevant for deep learning 3 the main result in theorem 2 only applies to the case when the learning rate is infinitesimal and the limit needs to be taken under certain scaling conditions this amounts to a continuoustime approximation and i think is a crucial limitation of the theoretical results the theorems reveal nothing about the behavior of sgd at a finite learning rate which is the actual regime that sgd is run in practice also i feel that this continuoustime condition should be stated much earlier in the draft 4 i am unconvinced by the experiment section both lenet and vgg are outdated architectures because they lack the residual structure i would ask for evaluation on at least a modern resnet to demonstrate the effectiveness of the proposed method even if evaluated on resnet i still think it would not suffice because it is quite easy to improve a vanilla model i would really want to see being able to improve some stateoftheart results for the paper to be experimentally convincing other important questions but may not constitute reasons for rejection 1 it has been found that the powerlaw index of sgd noise crucially depends on the learning rate of sgd see httpsarxivorgabs210602588 or httpsarxivorgabs210509557 how does this fact affect the theoretical results of this work minor question 1 page 3 what is regeneration structure the following two reasons are the main weaknesses based on which i recommend rejection 1 the theory seems irrelevant for machine learning deep learning 2 the experimental evaluation is weak because it uses outdated architecture and because the result only improves on badly performing vanilla training strategies therefore taken as a theoretical paper i find the theory limited and irrelevant taken as an experimentalmethod paper i find the improvement and methodology unconvincing ### Summary:
motivated by empirical observations that sgd performed on deep networks converge to regions of flatter loss curvature relative to large or full batch gd the authors perform a theoretical analysis of trajectories of sgd with the presence of heavy tailed noise the primary observation of the theory is that heavy tailed noise has a higher probability of kicking the current parameters to a new region of the input space which has some probability of lying in a sharper region however its important to note that in this analysis sgd with heavy tailed noise doesnt stay in the sharp regions but will eventually be kicked back out of it back to other regions in a sense this defines a transition graph which predicts that the steady state distribution should spend some fraction of time in different regions of the input space and different sharpness while never converging anywhere this is shown most clearly in figure 1 top center where the heavy tailed sgd randomly jumps between different regions of the input space throughout the entire training trajectory experiments are then run on deep networks showing that heavytailed sgd with gradient clipping converges to regions of flatter curvature reviews of the work were generally positive the theory is well presented and figure 1 does a solid job demonstrating the main idea the primary criticism was raised by reviewer hgyl arguing that the results should be largely irrelevant to deep learning most of the debate between this reviewer and the authors centered around whether or not relu networks have minima which extend off to infinity the ac will not dig into the details of the argument it seems clear however that if there were a deep learning workload with heavy tailed noise that the authors results will have some relevancy though the exact nature of the resulting transition graph may have a complicated dependence on the loss surface unfortunately the authors were unable to find a such a workload in image classification there is some prior work suggesting the nlp models with rare tokens may be a better fit and so needed to artificially induce heavy tailed noise to test their theory this is a bit of a limitation but given the clear writing and interesting experiments as noted by reviewers the work seems worth accepting the ac strongly urges the authors though to include a more lengthy discussed of wu et al as that work seems to agree with experiment of the sharpness of stable regions selected by sgd when run on deep models without heavy tailed noise
input_ids: token IDs (apparently this record's input_ids column) encoding a span of the review text above; the list is cut off mid-sequence in the source and is omitted here.
417, 4623, 323, 3676, 4715, 50276, 20, 253, 2022, 906, 275, 10012, 374, 760, 10384, 281, 253, 1083, 672, 253, 4715, 2281, 310, 47041, 1983, 285, 253, 2701, 3198, 281, 320, 2668, 762, 2176, 13642, 2515, 436, 8322, 281, 247, 44351, 26202, 553, 11193, 285, 891, 1158, 310, 247, 9560, 12291, 273, 253, 10527, 1543, 253, 39383, 10313, 2717, 670, 253, 3879, 273, 256, 35333, 387, 247, 6486, 4715, 2281, 534, 310, 253, 4588, 9459, 326, 256, 35333, 310, 1408, 275, 3946, 50275, 12563, 891, 1928, 326, 436, 44351, 26202, 553, 1617, 943, 320, 4767, 1199, 4321, 275, 253, 7482, 50276, 21, 891, 717, 10915, 8498, 758, 407, 253, 3368, 2593, 1097, 8472, 292, 285, 362, 1266, 403, 36761, 35615, 984, 597, 3480, 253, 12541, 2605, 891, 651, 1642, 323, 7103, 327, 387, 1878, 247, 4980, 501, 3024, 281, 7568, 253, 12510, 273, 253, 4081, 1332, 50276, 9154, 604, 6760, 327, 501, 3024, 891, 1335, 1158, 352, 651, 417, 36433, 984, 352, 310, 3240, 3477, 281, 3157, 247, 26724, 1566, 891, 651, 1663, 971, 281, 923, 1146, 2104, 281, 3157, 690, 1375, 23037, 14387, 1543, 323, 253, 2929, 281, 320, 21657, 21414, 50276, 977, 1774, 3533, 533, 778, 417, 12647, 4606, 323, 18235, 337, 352, 556, 644, 1119, 326, 253, 1612, 6937, 3605, 273, 256, 35333, 6046, 29325, 1365, 7024, 327, 253, 4715, 2281, 273, 256, 35333, 923, 5987, 39962, 2061, 5375, 16899, 1549, 1099, 2055, 390, 5987, 39962, 2061, 5375, 16899, 1235, 2222, 3011, 849, 1057, 436, 958, 2818, 253, 10527, 1543, 273, 436, 789, 50275, 37585, 1953, 337, 3239, 495, 752, 310, 22781, 2605, 253, 1563, 767, 4606, 403, 253, 2022, 32213, 1754, 327, 534, 891, 5583, 18235, 337, 253, 3762, 3133, 19124, 323, 5145, 4715, 50276, 22412, 4715, 374, 253, 5661, 7103, 310, 5075, 984, 352, 4648, 36761, 10336, 285, 984, 253, 906, 760, 19132, 327, 16426, 9591, 26724, 3733, 8130, 50276, 45230, 2668, 347, 247, 10527, 2929, 891, 1089, 253, 3762, 3710, 285, 19124, 2668, 347, 271, 5661, 9349, 2929, 891, 1089, 253, 7756, 285, 16182, 10915, 87, 19163, 50276, 187, 187, 4118, 18435, 27, 24013, 8550, 407, 16774, 7313, 326, 256, 35333, 2684, 327, 3676, 6928, 29623, 281, 4811, 273, 892, 2569, 2957, 16841, 4103, 281, 1781, 390, 2120, 14604, 305, 69, 253, 4477, 1347, 247, 10527, 1783, 273, 24102, 273, 256, 35333, 342, 253, 3361, 273, 5536, 246, 7193, 6046, 253, 3625, 8310, 273, 253, 3762, 310, 326, 5536, 246, 7193, 6046, 556, 247, 2169, 5912, 273, 29048, 253, 1655, 3602, 281, 247, 747, 2919, 273, 253, 3280, 2317, 534, 556, 690, 5912, 273, 10776, 275, 247, 17614, 468, 2919, 2299, 697, 1774, 281, 3877, 326, 275, 436, 1783, 256, 35333, 342, 5536, 246, 7193, 6046, 36908, 3297, 275, 253, 9479, 4811, 533, 588, 6524, 320, 19301, 896, 562, 273, 352, 896, 281, 643, 4811, 275, 247, 3282, 436, 13067, 247, 5502, 4216, 534, 26295, 326, 253, 11792, 1375, 3268, 943, 6947, 690, 6919, 273, 673, 275, 1027, 4811, 273, 253, 3280, 2317, 285, 1027, 9479, 1255, 1223, 1620, 5975, 3390, 9825, 436, 310, 2011, 954, 4518, 275, 4677, 337, 1755, 4055, 835, 253, 5536, 246, 7193, 256, 35333, 12421, 27287, 875, 1027, 4811, 273, 253, 3280, 2317, 4768, 253, 2862, 3733, 18974, 4679, 403, 840, 1408, 327, 3676, 6928, 4645, 326, 3573, 1767, 7193, 256, 35333, 342, 11786, 502, 8201, 26414, 281, 4811, 273, 892, 2569, 16841, 50275, 15337, 84, 273, 253, 789, 497, 3839, 2762, 253, 3762, 310, 973, 3559, 285, 4677, 337, 1057, 247, 4891, 2628, 17227, 253, 2022, 2934, 253, 3625, 14226, 369, 5439, 407, 37317, 288, 72, 1190, 16425, 326, 253, 1543, 943, 320, 8127, 19124, 281, 3676, 4715, 954, 273, 253, 8881, 875, 436, 37317, 285, 253, 4477, 18932, 1475, 1880, 390, 417, 774, 
86, 6928, 452, 46836, 534, 9017, 745, 281, 23579, 253, 913, 588, 417, 2836, 715, 253, 4278, 273, 253, 4154, 352, 3133, 2590, 2299, 326, 604, 627, 497, 247, 3676, 4715, 32140, 342, 5536, 246, 7193, 6046, 326, 253, 4477, 1543, 588, 452, 690, 1693, 87, 4306, 2167, 253, 3242, 3753, 273, 253, 4795, 5502, 4216, 778, 452, 247, 9542, 10096, 327, 253, 2957, 2553, 19235, 253, 4477, 497, 7591, 281, 1089, 247, 824, 247, 32140, 275, 2460, 9162, 627, 310, 690, 2720, 789, 7738, 253, 295, 24343, 3210, 342, 7520, 21761, 778, 320, 247, 1805, 4944, 285, 594, 3058, 281, 41544, 10808, 5536, 246, 7193, 6046, 281, 1071, 616, 3762, 436, 310, 247, 2372, 273, 247, 12291, 533, 1677, 253, 2590, 4028, 285, 4722, 4679, 347, 4879, 407, 30628, 253, 789, 3133, 4409, 18738, 253, 913, 7052, 35733, 253, 4477, 2167, 281, 2486, 247, 625, 24585, 5469, 273, 259, 86, 1162, 355, 347, 326, 789, 3133, 281, 5194, 342, 3368, 273, 253, 9479, 1255, 273, 6474, 4811, 4236, 407, 256, 35333, 672, 1408, 327, 3676, 3210, 1293, 5536, 246, 7193, 6046 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 4477, 1246, 1223, 253, 3332, 10527, 285, 16774, 4342, 275, 6239, 22175, 253, 2954, 875, 4715, 2281, 285, 8105, 3605, 285, 263, 26647, 253, 4477, 10380, 2613, 616, 16182, 285, 4679, 327, 253, 6326, 5928, 273, 5536, 32936, 275, 11786, 6046, 275, 2460, 9162, 8892, 436, 310, 417, 7933, 2032, 285, 436, 958, 310, 6210, 392, 1829, 264, 1677, 436, 958, 253, 958, 326, 253, 4477, 1246, 253, 1666, 25379, 347, 289, 565, 7193, 6046, 11333, 1293, 667, 7000, 1783, 273, 4715, 2281, 50276, 23941, 1979, 285, 616, 2538, 327, 26647, 347, 973, 347, 417, 26230, 253, 8105, 3605, 273, 841, 33737, 310, 10084, 891, 1158, 253, 4477, 878, 281, 1379, 715, 2395, 625, 3332, 1543, 275, 253, 6239, 285, 6830, 6990, 616, 5661, 7533, 285, 5955, 15672, 436, 310, 3340, 1774, 1580, 253, 4477, 4388, 281, 30648, 2139, 247, 2173, 11237, 273, 256, 35333, 5644, 281, 11701, 50275, 73, 351, 72, 31299, 632, 312, 285, 278, 44023, 259, 35926, 2153, 9169, 43904, 6046, 285, 5536, 32936, 275, 19191, 13757, 549, 32693, 1518, 25358, 19630, 29180, 14168, 1098, 480, 2517, 2832, 1148, 32693, 2061, 5375, 1518, 25358, 19630, 50276, 72, 4063, 7958, 7187, 23818, 278, 797, 5111, 307, 516, 1441, 965, 285, 23614, 75, 279, 72, 1182, 11917, 43425, 253, 3573, 1767, 647, 11562, 275, 256, 35333, 549, 32693, 1518, 1549, 2504, 1449, 29180, 14168, 1098, 480, 2517, 2832, 1148, 32693, 2061, 5375, 1518, 1549, 2504, 1449, 50276, 282, 30567, 319, 90, 14617, 247, 2081, 340, 284, 14990, 270, 1240, 363, 5105, 266, 277, 7885, 480, 284, 13420, 594, 73, 392, 781, 6339, 285, 5599, 305, 321, 1792, 9169, 253, 1781, 4715, 2281, 3408, 273, 3676, 4715, 253, 5798, 522, 503, 5122, 14172, 5987, 39962, 2061, 5375, 1518, 1229, 1423, 1093, 87, 18, 253, 4477, 1246, 4722, 6260, 285, 1543, 5001, 247, 7321, 2715, 273, 256, 35333, 275, 13757, 616, 14846, 273, 253, 3332, 6239, 285, 253, 5661, 2216, 1754, 627, 20026, 3133, 281, 878, 625, 4116, 5474, 33032, 2520, 2929, 1263, 253, 31156, 3879, 273, 3573, 1767, 7193, 256, 35333, 342, 11786, 502, 8201, 352, 310, 1119, 326, 11786, 502, 8201, 310, 9560, 323, 3573, 1767, 7193, 256, 35333, 281, 3693, 9479, 46836, 253, 5044, 30328, 310, 326, 253, 502, 8201, 4254, 11355, 253, 4181, 4395, 407, 1016, 256, 35333, 5731, 3103, 323, 46836, 6891, 685, 253, 7887, 253, 502, 8201, 1057, 417, 1818, 253, 806, 10463, 673, 2299, 323, 4618, 46836, 256, 35333, 310, 28837, 1066, 285, 3936, 625, 673, 281, 8773, 17912, 352, 310, 625, 2779, 326, 256, 35333, 1150, 684, 275, 4618, 46836, 50273, 856, 84, 50275, 66, 5389, 10527, 1783, 310, 2530, 323, 247, 327, 264, 37613, 13016, 762, 690, 2605, 13260, 50275, 635, 47860, 5132, 85, 474, 4679, 403, 2530, 281, 15249, 253, 16875, 4431, 285, 3762, 50276, 38358, 407, 253, 1840, 1783, 253, 4477, 4081, 767, 11640, 273, 256, 35333, 342, 13945, 3573, 1767, 7193, 6046, 50276, 29844, 502, 8201, 841, 7321, 48237, 1397, 403, 3264, 281, 29623, 281, 892, 2569, 46836, 7624, 2087, 3006, 1805, 50275, 783, 4679, 323, 3676, 37507, 327, 8142, 16192, 382, 285, 260, 338, 274, 740, 46277, 274, 2313, 403, 4209, 285, 921, 12532, 11701, 689, 26724, 256, 35333, 50275, 5040, 50276, 262, 3133, 326, 512, 253, 6260, 11823, 253, 6046, 2605, 285, 403, 760, 7514, 342, 253, 9777, 323, 4227, 323, 1014, 247, 327, 264, 37613, 1895, 253, 11786, 6046, 273, 26724, 1054, 487, 1506, 256, 35333, 310, 4767, 2662, 2299, 11786, 33737, 275, 253, 3861, 49225, 8062, 5867, 275, 436, 2929, 923, 16186, 374, 403, 891, 301, 3632, 4903, 347, 1529, 1650, 4677, 337, 476, 320, 24363, 1580, 253, 1708, 29551, 48237, 1397, 760, 897, 1375, 17777, 6046, 352, 310, 1896, 326, 256, 35333, 
342, 18872, 6046, 476, 3693, 1110, 9479, 46836, 4336, 253, 4477, 943, 557, 290, 2134, 253, 1055, 273, 6046, 32800, 285, 6046, 5289, 285, 1375, 4518, 752, 310, 7514, 275, 436, 2929, 50273, 1542, 253, 2905, 789, 2045, 789, 12392, 253, 7882, 17477, 34528, 432, 9479, 46836, 310, 4336, 12841, 891, 651, 1804, 10941, 342, 731, 923, 15567, 285, 253, 3806, 15308, 323, 1650, 253, 7882, 17477, 34528, 476, 2028, 441, 326, 275, 4677, 337, 305, 69, 342, 247, 4942, 1781, 4715, 2281, 1620, 29623, 281, 253, 9479, 46836, 278, 18, 278, 19, 50272, 977, 5701, 50275, 249, 4677, 337, 4496, 1056, 352, 2590, 275, 253, 11743, 326, 835, 253, 256, 35333, 7866, 432, 50274, 249, 253, 1273, 12494, 273, 3239, 374, 891, 717, 417, 2119, 2139, 256, 35333, 342, 1708, 29551, 6046, 1620, 8773, 253, 9479, 46836, 278, 20, 891, 9428, 326, 407, 6240, 1781, 2217, 6046, 256, 35333, 387, 1878, 476, 8773, 432, 352, 3738, 352, 310, 347, 5919, 347, 253, 3573, 1767, 7193, 256, 35333, 50273, 338, 359, 3523, 3733, 256, 35333, 387, 271, 10341, 673, 1127, 352, 310, 2761, 16293, 326, 352, 31451, 320, 387, 247, 9479, 5927, 436, 1750, 3133, 3430, 281, 479, 3738, 3573, 1767, 7193, 256, 35333, 342, 11786, 502, 8201, 476, 3693, 253, 9479, 46836, 4336, 352, 778, 1379, 247, 1077, 1048, 673, 50274, 249, 2593, 577, 10214, 281, 253, 1304, 275, 256, 516, 84, 34978, 965, 1162, 355, 6247, 66, 3573, 1767, 7193, 6046, 778, 417, 320, 33079, 275, 2460, 9162, 8892, 2139, 310, 627, 436, 20620, 275, 1798, 2829, 337, 5936, 326, 256, 35333, 248, 580, 1767, 7193, 6046, 17923, 1077, 16426, 1014, 7197, 685, 1781, 23941, 256, 35333, 476, 368, 5513, 352, 436, 671, 40878, 253, 5132, 85, 474, 4679, 835, 253, 3573, 1767, 7193, 256, 35333, 26414, 281, 6507, 46836, 625, 2779, 685, 1708, 29551, 256, 35333, 50275, 249, 12494, 1840, 2593, 608, 2362, 1014, 50272, 18, 259, 86, 43278, 1162, 355, 50276, 5430, 256, 35333, 34899, 253, 4156, 46836, 275, 689, 19484, 1025, 4715, 247, 18525, 7882, 8668, 50276, 19, 480, 505, 83, 2721, 1768, 5985, 1162, 355, 253, 1517, 640, 1261, 1127, 327, 13757, 24102, 273, 3676, 11454, 6928, 50276, 20, 820, 864, 1162, 355, 11786, 18499, 327, 11454, 6928, 5431, 6634, 387, 253, 5024, 273, 7882, 436, 2929, 12957, 326, 3573, 1767, 7193, 6046, 29844, 502, 8201, 476, 1361, 256, 35333, 13469, 253, 9479, 46836, 5389, 10527, 1783, 285, 47860, 10704, 4679, 403, 2530, 50276, 7152, 339, 431, 248, 2929, 2175, 11786, 18499, 342, 13945, 1612, 6937, 8105, 6046, 253, 789, 2722, 326, 275, 253, 47041, 1983, 4715, 2281, 9459, 253, 5536, 8105, 6046, 476, 2847, 305, 69, 281, 417, 281, 29623, 281, 9479, 46836, 50275, 3169, 327, 436, 3762, 253, 2929, 29328, 247, 5853, 281, 14888, 6046, 281, 305, 69, 281, 1361, 3733, 50276, 74, 11435, 253, 15965, 8132, 263, 273, 253, 789, 533, 891, 717, 417, 13762, 407, 697, 5145, 4715, 50276, 22412, 4715, 17200, 50276, 46458, 891, 1089, 253, 1563, 2792, 20276, 337, 253, 4060, 3133, 19582, 253, 3159, 256, 35333, 310, 2668, 281, 1599, 247, 2714, 2238, 273, 6046, 326, 310, 1955, 281, 1054, 487, 1506, 10491, 436, 789, 2299, 760, 2175, 305, 69, 342, 13945, 1612, 6937, 6046, 891, 1158, 352, 310, 24363, 281, 1333, 256, 35333, 275, 253, 4060, 50276, 19, 253, 13260, 1646, 1512, 2266, 285, 3676, 4715, 19124, 581, 273, 619, 2022, 21915, 310, 326, 253, 2929, 19584, 30268, 1142, 46836, 2568, 11454, 6928, 1097, 762, 3575, 292, 50065, 285, 689, 3575, 292, 50065, 943, 452, 29556, 1142, 46836, 285, 253, 958, 326, 253, 46836, 273, 11454, 6928, 403, 29458, 2789, 352, 19582, 281, 4647, 247, 5502, 4216, 1783, 436, 840, 2789, 479, 1158, 326, 253, 10527, 1783, 310, 
417, 4623, 323, 3676, 4715, 50276, 20, 253, 2022, 906, 275, 10012, 374, 760, 10384, 281, 253, 1083, 672, 253, 4715, 2281, 310, 47041, 1983, 285, 253, 2701, 3198, 281, 320, 2668, 762, 2176, 13642, 2515, 436, 8322, 281, 247, 44351, 26202, 553, 11193, 285, 891, 1158, 310, 247, 9560, 12291, 273, 253, 10527, 1543, 253, 39383, 10313, 2717, 670, 253, 3879, 273, 256, 35333, 387, 247, 6486, 4715, 2281, 534, 310, 253, 4588, 9459, 326, 256, 35333, 310, 1408, 275, 3946, 50275, 12563, 891, 1928, 326, 436, 44351, 26202, 553, 1617, 943, 320, 4767, 1199, 4321, 275, 253, 7482, 50276, 21, 891, 717, 10915, 8498, 758, 407, 253, 3368, 2593, 1097, 8472, 292, 285, 362, 1266, 403, 36761, 35615, 984, 597, 3480, 253, 12541, 2605, 891, 651, 1642, 323, 7103, 327, 387, 1878, 247, 4980, 501, 3024, 281, 7568, 253, 12510, 273, 253, 4081, 1332, 50276, 9154, 604, 6760, 327, 501, 3024, 891, 1335, 1158, 352, 651, 417, 36433, 984, 352, 310, 3240, 3477, 281, 3157, 247, 26724, 1566, 891, 651, 1663, 971, 281, 923, 1146, 2104, 281, 3157, 690, 1375, 23037, 14387, 1543, 323, 253, 2929, 281, 320, 21657, 21414, 50276, 977, 1774, 3533, 533, 778, 417, 12647, 4606, 323, 18235, 337, 352, 556, 644, 1119, 326, 253, 1612, 6937, 3605, 273, 256, 35333, 6046, 29325, 1365, 7024, 327, 253, 4715, 2281, 273, 256, 35333, 923, 5987, 39962, 2061, 5375, 16899, 1549, 1099, 2055, 390, 5987, 39962, 2061, 5375, 16899, 1235, 2222, 3011, 849, 1057, 436, 958, 2818, 253, 10527, 1543, 273, 436, 789, 50275, 37585, 1953, 337, 3239, 495, 752, 310, 22781, 2605, 253, 1563, 767, 4606, 403, 253, 2022, 32213, 1754, 327, 534, 891, 5583, 18235, 337, 253, 3762, 3133, 19124, 323, 5145, 4715, 50276, 22412, 4715, 374, 253, 5661, 7103, 310, 5075, 984, 352, 4648, 36761, 10336, 285, 984, 253, 906, 760, 19132, 327, 16426, 9591, 26724, 3733, 8130, 50276, 45230, 2668, 347, 247, 10527, 2929, 891, 1089, 253, 3762, 3710, 285, 19124, 2668, 347, 271, 5661, 9349, 2929, 891, 1089, 253, 7756, 285, 16182, 10915, 87, 19163, 50276, 187, 187, 4118, 18435, 27, 24013, 8550, 407, 16774, 7313, 326, 256, 35333, 2684, 327, 3676, 6928, 29623, 281, 4811, 273, 892, 2569, 2957, 16841, 4103, 281, 1781, 390, 2120, 14604, 305, 69, 253, 4477, 1347, 247, 10527, 1783, 273, 24102, 273, 256, 35333, 342, 253, 3361, 273, 5536, 246, 7193, 6046, 253, 3625, 8310, 273, 253, 3762, 310, 326, 5536, 246, 7193, 6046, 556, 247, 2169, 5912, 273, 29048, 253, 1655, 3602, 281, 247, 747, 2919, 273, 253, 3280, 2317, 534, 556, 690, 5912, 273, 10776, 275, 247, 17614, 468, 2919, 2299, 697, 1774, 281, 3877, 326, 275, 436, 1783, 256, 35333, 342, 5536, 246, 7193, 6046, 36908, 3297, 275, 253, 9479, 4811, 533, 588, 6524, 320, 19301, 896, 562, 273, 352, 896, 281, 643, 4811, 275, 247, 3282, 436, 13067, 247, 5502, 4216, 534, 26295, 326, 253, 11792, 1375, 3268, 943, 6947, 690, 6919, 273, 673, 275, 1027, 4811, 273, 253, 3280, 2317, 285, 1027, 9479, 1255, 1223, 1620, 5975, 3390, 9825, 436, 310, 2011, 954, 4518, 275, 4677, 337, 1755, 4055, 835, 253, 5536, 246, 7193, 256, 35333, 12421, 27287, 875, 1027, 4811, 273, 253, 3280, 2317, 4768, 253, 2862, 3733, 18974, 4679, 403, 840, 1408, 327, 3676, 6928, 4645, 326, 3573, 1767, 7193, 256, 35333, 342, 11786, 502, 8201, 26414, 281, 4811, 273, 892, 2569, 16841, 50275, 15337, 84, 273, 253, 789, 497, 3839, 2762, 253, 3762, 310, 973, 3559, 285, 4677, 337, 1057, 247, 4891, 2628, 17227, 253, 2022, 2934, 253, 3625, 14226, 369, 5439, 407, 37317, 288, 72, 1190, 16425, 326, 253, 1543, 943, 320, 8127, 19124, 281, 3676, 4715, 954, 273, 253, 8881, 875, 436, 37317, 285, 253, 4477, 18932, 1475, 1880, 390, 417, 774, 
86, 6928, 452, 46836, 534, 9017, 745, 281, 23579, 253, 913, 588, 417, 2836, 715, 253, 4278, 273, 253, 4154, 352, 3133, 2590, 2299, 326, 604, 627, 497, 247, 3676, 4715, 32140, 342, 5536, 246, 7193, 6046, 326, 253, 4477, 1543, 588, 452, 690, 1693, 87, 4306, 2167, 253, 3242, 3753, 273, 253, 4795, 5502, 4216, 778, 452, 247, 9542, 10096, 327, 253, 2957, 2553, 19235, 253, 4477, 497, 7591, 281, 1089, 247, 824, 247, 32140, 275, 2460, 9162, 627, 310, 690, 2720, 789, 7738, 253, 295, 24343, 3210, 342, 7520, 21761, 778, 320, 247, 1805, 4944, 285, 594, 3058, 281, 41544, 10808, 5536, 246, 7193, 6046, 281, 1071, 616, 3762, 436, 310, 247, 2372, 273, 247, 12291, 533, 1677, 253, 2590, 4028, 285, 4722, 4679, 347, 4879, 407, 30628, 253, 789, 3133, 4409, 18738, 253, 913, 7052, 35733, 253, 4477, 2167, 281, 2486, 247, 625, 24585, 5469, 273, 259, 86, 1162, 355, 347, 326, 789, 3133, 281, 5194, 342, 3368, 273, 253, 9479, 1255, 273, 6474, 4811, 4236, 407, 256, 35333, 672, 1408, 327, 3676, 3210, 1293, 5536, 246, 7193, 6046 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper outlines a 3stage process to train policies stage 1 involves training an encoder using large image datasets to obtain a taskagnostic visual representation stage 2 involves training a policy using offline rl and the pretrained encoder this serves as a good initialization for the final stage 3 that involves further training the agent with online rl strengths the overall pipeline appears obvious and straightforward in my view the main contribution of the paper is putting together the components and demonstrating empirical success weaknesses while the overall approach and results look promising the current submission unfortunately lacks technical depth and rigor in the introduction section the authors outline their main contributions unfortunately i do not believe there is sufficient evidence to rigorously substantiate these claims 1 novel framework i believe there is limited novelty in this submission especially in light of several recent papers most of which are published by now for example the current submission does not cite or discuss mvp1 pvr2 r3m3 which all explore training encoders from out of domain data 2 novel technical contributions the main technical contribution claims are a cce or convolutional channel expansion b safe q technique as for cce there are no experiments to benchmark it against simple alternatives like concatenation of framewise embeddings 123 or their latent differences 4 which have been successfully used in several prior works the safeq update does seem to help compared to a naive update rule however the challenges in transitioning from offline rl to online rl have been welldocumented in prior work along with different solutions eg awac 5 the authors should consider comparisons with such sota baselines as opposed to naive strawman baselines overall due to the lack of rigor in comparisons and baselines it is hard to judge the importance or utility of the claimed technical contributions vrl3 achieves an entirely new level of sota sample efficiency parameter efficiency and computation efficiency on the highly challenging adroit benchmark while also being very competitive on dmc i would encourage the authors to rephrase such statements to more technically rigorous and falsifiable claims 1 xiao et al masked visual pretraining for motor control arxiv 2022 2 parisi et al the unsurprising effectiveness of pretrained vision models for control icml 2022 3 nair et al r3m a universal visual representation for robot manipulation arxiv 2022 4 shang et al reinforcement learning with latent flow neurips 2021 5 nair et al awac accelerating online reinforcement learning with offline datasets 2020 the limitation section in the paper is too generic eg evaluation in sim vs real robots docsepthe authors tackle the problem of improving the efficiency of online learning for visual rl tasks the main idea of the paper is to pretrain an encoder on nonrl data imagenet then train a value function and policy network on top of the shared encoder with offline rl and finally finetune with online data with this simple framework and a small number of technical tweaks eg how to pretrain the encoder with singleimage input but transfer to tasks with multiimage input how to stabilize the value function the authors show that their method achieves sota sample efficiency on the adroit benchmark i found this paper a breeze to read the method is very practical and leverages popular
tools in the contemporary ml toolbox pretraining and finetuning offline rl in order to achieve impressive sample efficiency on synthetic visualinput robotic tasks i especially appreciate the careful ablations the authors performed in section 5 which i believe gives insight into the contribution of each of the 3 stages that the authors propose for their method i think this paper provides valuable insight for the ml community on the relative importance of leveraging nonrl offline and online data for rl tasks the main weakness i see for this paper is that the authors make some claims about the importance of offline rl vs bc in section 52 that i think need an additional experiment to verify in lines 263-265 the authors claim that bc is the same as entirely ignoring stage 2 altogether since bc only trains the actor and not the critic i think bc should provide a stronger baseline than this maybe the authors could try a stage 2 in which the policy is trained with bc and the value function is trained in a naive way to predict cumulative reward of the offline trajectories my takeaway from figure 4 is that as long as we have a decent policyvalue match going into stage 3 the tasks will get solved even if the encoder hasnt even been trained yet with smoother more efficient learning when the encoder is pretrained on imagenet it would be great to disentangle the intricacies of drqv2 for offline data from the goal of pretraining both the policy and value function in stage 2 yes the authors discuss that experiments on simulated benchmarks do not imply success in the real world docsepthe authors propose vrl3 a 3 stage data driven technique for pretraining convolutional encoders performing offline rl and fine tuning with online rl the authors demonstrate their approach in a number of vision based control benchmarks including adroit and dmc the approach greatly improves sample efficiency and outperforms endtoend methods the usage of offline datasets and pretrained encoders is an interesting field of work but i am concerned that the problems this framework is evaluated on are somewhat trivial and i would be interested to see the results after more iterations of fine tuning strengths the paper and methodology are clear and easy to read improved sample efficiency the authors share their codebase and plan to release it as open source with the data checkpoints etc this method could potentially be applied to a large variety of benchmarks and achieve sota in terms of sample efficiency weaknesses the method requires training an encoder at the observation size of the environment i am not convinced that this approach outperforms pure online methods if they are left to convergence figure 2a shows the performance of rrl still increasing has this baseline actually converged i am not convinced the approach would transfer well to pomdp settings and more challenging benchmarks deepmind lab habitatlab it would have been interesting to see comparisons of other pretraining regimes for the encoder selfsupervised byol contrastive etc yes docsepthis paper proposes a 3stage visual rl framework consisting of encoder pretraining with imagenet offline finetuning and online finetuning the encoder pretraining stage leverages an offtheshelf largescale dataset to learn a powerful representation then the offline pretraining uses a conservative method to transfer the learned representation into taskspecific embeddings finally the online rl achieves high sample efficiency in accomplishing adroit tasks the core contribution of this paper is
the usage of the imagenet dataset and an offline method to perform pretraining the authors also conduct detailed ablative experiments to understand the challenges and important design choices strength although none of the components are new the combined framework is powerful within this framework the authors conduct extensive experiments to understand possible alternatives and analyze the challenges this sheds light for future researchers who are willing to jump into this field the sample efficiency on the adroit benchmark is much higher than previous baselines and their forged stronger baselines by a large margin weakness the overall writing quality is slightly concerning this is an issue throughout the whole paper for example when the authors are describing their experimental results they use very subjective words such as good very without supporting evidence in the introduction the authors also describe the necessity of their method using the word we want to these subjective tones heavily reduce the scientific value of this paper moreover there are many grammatical issues slight overclaim overselling and abuse of terms the authors emphasize that their method is a minimalist approach a 3stage approach with tons of tunings can hardly be regarded as minimalist when the authors mention offline rl they should be more careful when they are only using expert demonstrations this needs clarification somewhere in the paper the overall paper quality especially in terms of the experimental aspects is ok that being said i think overselling the paper is not a good idea the authors mentioned that it is a resnet architecture however if it does not have skip connections the naming is inaccurate according to the supp material the framework is brittle to many hyperparameters for example the q threshold is extremely important to tune what would happen if other people want to use this paper does the tuning take a lot of effort the data augmentation technique seems important more reasoning and explanation are needed the authors mention real robot experiments and the large domain gap as their limitations these are reasonable thank you for being straightforward ### Summary:
this paper introduced a simple paradigm for improving the sample efficiency of training deep reinforcement learning policies for visionbased control tasks the idea is to use a 3stage pipeline 1 pretraining visual representations on largescale image datasets 2 policy training with offline rl and 3 finetuning the policy with online rl this work received mixed reviews from four reviewers with one reject one weak reject and two weak accepts the reviewers appreciated the demonstrated effectiveness of the proposed approach despite the simplicity of the approach meanwhile they expressed major concerns regarding the limited novelty concerning the burgeoning body of literature on visual pretraining and limited evaluations and ablation studies the authors drafted very detailed responses to the reviewers comments which clarified many technical issues brought up in the initial reviews at the end of the discussion period reviewer i5gh who did not engage in the discussions and reviewer 1wyj maintained their negative ratings of this paper while the other two voted weak accept the ac read the paper the reviews and the authors responses carefully reviewer i5ghs main criticisms are 1 the novelty of vrl3 and missing citations and discussions of prior work mvp pvr r3m etc and 2 insufficient comparisons and ablations of key model designs the ac checked the publicationrelease dates of the mentioned works and believed they should be considered concurrent with this submission thus the technical merit of this work should not be penalized by the existence of these related works meanwhile the authors added the citations and discussions about these works in the revised draft which addressed the reviewers comment in addition the authors also provided additional ablation studies and clarifications which addressed the second point raised by reviewer i5gh reviewer 1wyj expressed concerns about the heavy revision during rebuttal and the overclaim and overselling of the approach the ac agreed with this reviewer that some language such as minimalist should be toned down in the next revision of this manuscript taking all these into account the ac found that the rebuttal has addressed the major issues raised in the reviews even though this work does not generate revolutionary ideas it has shown convincing evidence of a practical approach that improves the learning efficiency of deep reinforcement learning in challenging visionbased control tasks this work may pave the way for future work to develop more advanced methods therefore the ac thinks that this work has passed the bar of acceptance at neurips despite the mixed final ratings
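To make the three-stage recipe discussed in these reviews and the meta-review concrete, the sketch below shows how such a pipeline could be wired together. It is an illustration only, not the authors' code: the function names (stage1_pretrain_encoder, stage2_offline_rl, stage3_online_rl), the demonstration-buffer interface, and the q_threshold clamp used as a stand-in for the reviewers' "safe Q" discussion are all hypothetical, and the actual VRL3 and DrQ-v2 implementations differ in many details.

```python
# Minimal sketch of a generic 3-stage visual RL pipeline, for illustration only.
# All names and the q_threshold rule are hypothetical stand-ins, not VRL3's actual code.
import torch
import torch.nn as nn


def stage1_pretrain_encoder(encoder, head, imagenet_loader, epochs=5):
    """Stage 1: task-agnostic pretraining of the conv encoder on non-RL image data."""
    opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-4)
    for _ in range(epochs):
        for images, labels in imagenet_loader:
            loss = nn.functional.cross_entropy(head(encoder(images)), labels)
            opt.zero_grad(); loss.backward(); opt.step()
    return encoder


def stage2_offline_rl(encoder, actor, critic, demos, q_threshold=50.0, steps=30_000, gamma=0.99):
    """Stage 2: offline RL on demonstrations; clamping the Q-target is a crude
    stand-in for the conservative 'safe Q' idea debated in the reviews."""
    opt = torch.optim.Adam(list(actor.parameters()) + list(critic.parameters()), lr=1e-4)
    for _ in range(steps):
        obs, act, rew, next_obs, done = demos.sample()  # hypothetical demonstration-buffer API
        with torch.no_grad():
            next_q = critic(encoder(next_obs), actor(encoder(next_obs)))
            target = rew + gamma * (1.0 - done) * next_q.clamp(max=q_threshold)
        loss = nn.functional.mse_loss(critic(encoder(obs), act), target)
        opt.zero_grad(); loss.backward(); opt.step()
    return actor, critic


def stage3_online_rl(env, encoder, actor, critic, steps=100_000):
    """Stage 3: keep training the same networks with online interaction, reusing the
    stage-2 update on a replay buffer that now mixes demonstrations and fresh rollouts."""
    # omitted: environment loop, exploration noise, and replay management
    ...
```

The sketch only makes the division of labor between the three stages explicit; the design choices the reviewers debate (BC versus offline RL in stage 2, how conservative the Q update should be, whether the encoder stays frozen) would all live inside stage2_offline_rl and stage3_online_rl.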
[ 285, 8132, 263, 275, 253, 10199, 2593, 253, 4477, 19270, 616, 2022, 9021, 19235, 891, 513, 417, 2868, 627, 310, 4209, 1941, 281, 8132, 29689, 4326, 4513, 841, 3916, 50276, 18, 4460, 7792, 50276, 74, 2868, 627, 310, 3710, 38135, 275, 436, 19529, 3340, 275, 1708, 273, 2067, 3332, 9380, 954, 273, 534, 403, 3863, 407, 1024, 323, 1650, 253, 1655, 19529, 1057, 417, 26542, 390, 2319, 278, 29035, 18, 268, 24987, 19, 391, 20, 78, 20, 534, 512, 8338, 3733, 2349, 351, 398, 432, 562, 273, 5028, 941, 50276, 19, 4460, 7681, 9021, 253, 2022, 7681, 7680, 3916, 403, 247, 260, 336, 390, 27311, 267, 5048, 7466, 270, 4999, 2805, 5853, 347, 323, 260, 336, 627, 403, 642, 4679, 281, 22791, 352, 1411, 2969, 18075, 751, 523, 15030, 318, 273, 3665, 3020, 46234, 15567, 390, 616, 21624, 3910, 577, 534, 452, 644, 8379, 908, 275, 2067, 2720, 2987, 253, 4999, 82, 5731, 1057, 1646, 281, 1361, 2429, 281, 247, 27785, 5731, 4086, 2299, 253, 7881, 275, 5502, 272, 432, 28841, 391, 77, 281, 3909, 391, 77, 556, 644, 6210, 392, 1829, 264, 275, 2720, 789, 2112, 342, 1027, 5482, 24088, 3768, 317, 608, 253, 4477, 943, 1908, 14023, 342, 824, 256, 5503, 1666, 25379, 347, 10066, 281, 27785, 17844, 1342, 1666, 25379, 4583, 1955, 281, 253, 3480, 273, 8132, 263, 275, 14023, 285, 1666, 25379, 352, 310, 1892, 281, 5963, 253, 6349, 390, 11839, 273, 253, 7558, 7681, 9021, 50276, 87, 8435, 20, 33526, 271, 7094, 747, 1268, 273, 256, 5503, 3410, 6733, 4764, 6733, 285, 13782, 6733, 327, 253, 4122, 11132, 519, 14790, 22791, 1223, 671, 1146, 1077, 12085, 327, 277, 17475, 50276, 74, 651, 11907, 253, 4477, 281, 294, 40712, 824, 7234, 281, 625, 22335, 26565, 285, 21649, 18397, 3916, 50275, 18, 1269, 22728, 1162, 355, 34741, 5304, 3215, 26208, 323, 5694, 1453, 549, 32693, 1384, 1423, 50276, 19, 1061, 13401, 1162, 355, 253, 5061, 321, 20733, 12510, 273, 3215, 11273, 8113, 3210, 323, 1453, 17857, 1686, 1384, 1423, 50276, 20, 295, 1094, 1162, 355, 391, 20, 78, 247, 10898, 5304, 6779, 323, 15688, 19763, 549, 32693, 1384, 1423, 50276, 21, 439, 606, 1162, 355, 35221, 4715, 342, 21624, 2685, 5723, 2824, 43425, 50276, 22, 295, 1094, 1162, 355, 3768, 317, 38757, 3909, 35221, 4715, 342, 28841, 15302, 9169, 253, 12291, 2593, 275, 253, 2929, 310, 1512, 12314, 24088, 7103, 275, 948, 4632, 1524, 25497, 50276, 7152, 339, 431, 248, 4477, 18915, 253, 1895, 273, 11138, 253, 6733, 273, 3909, 4715, 323, 5304, 391, 77, 8892, 253, 2022, 2934, 273, 253, 2929, 310, 281, 3215, 1949, 271, 32049, 327, 1327, 8435, 941, 4440, 257, 292, 840, 6194, 247, 1318, 1159, 285, 3646, 2990, 327, 1755, 273, 253, 6096, 32049, 342, 28841, 391, 77, 285, 4720, 1442, 292, 2517, 342, 3909, 941, 342, 436, 2969, 7792, 285, 247, 1355, 1180, 273, 7681, 13660, 8765, 24088, 849, 281, 3215, 1949, 253, 32049, 342, 2014, 5695, 3280, 533, 3700, 281, 8892, 342, 4471, 5695, 3280, 849, 281, 33292, 253, 1318, 1159, 253, 4477, 921, 326, 616, 1332, 33526, 256, 5503, 3410, 6733, 327, 253, 519, 14790, 22791, 891, 1119, 436, 2929, 247, 29178, 281, 1239, 253, 1332, 310, 1077, 8542, 285, 19732, 1131, 4633, 5657, 275, 253, 13399, 13361, 4968, 3364, 3215, 26208, 285, 1442, 292, 25004, 28841, 391, 77, 275, 1340, 281, 5115, 13943, 3410, 6733, 327, 13506, 5304, 5423, 35121, 8892, 891, 3340, 11435, 253, 10182, 490, 77, 569, 253, 4477, 2684, 275, 2593, 608, 534, 891, 2868, 4245, 12288, 715, 253, 7680, 273, 1016, 273, 253, 495, 8661, 326, 253, 4477, 12661, 323, 616, 1332, 891, 1158, 436, 2929, 3400, 9865, 12288, 323, 253, 13361, 3114, 327, 253, 4103, 6349, 273, 19732, 2977, 1327, 8435, 28841, 285, 3909, 941, 323, 391, 77, 
8892, 50276, 783, 2022, 14855, 891, 923, 323, 436, 2929, 310, 891, 1158, 253, 4477, 1056, 690, 3916, 670, 253, 6349, 273, 28841, 391, 77, 4632, 49501, 275, 2593, 8073, 326, 891, 1158, 878, 271, 3081, 3368, 281, 12654, 275, 3104, 3436, 1237, 2082, 253, 4477, 1750, 326, 49501, 310, 253, 1072, 347, 7094, 23111, 3924, 374, 17965, 1580, 49501, 760, 18784, 253, 12353, 285, 417, 253, 7291, 891, 1158, 49501, 943, 2085, 247, 10046, 8245, 685, 436, 50276, 28489, 253, 4477, 812, 1611, 247, 3924, 374, 275, 534, 253, 3646, 310, 10166, 342, 49501, 285, 253, 1318, 1159, 310, 10166, 275, 247, 27785, 1039, 281, 3283, 18849, 10921, 273, 253, 28841, 24102, 619, 1379, 12594, 432, 4677, 577, 310, 326, 347, 1048, 347, 359, 452, 247, 12524, 3646, 2877, 3761, 1469, 715, 3924, 495, 253, 8892, 588, 755, 14042, 1014, 604, 253, 32049, 556, 2649, 1014, 644, 10166, 2568, 342, 39797, 977, 625, 5919, 4715, 672, 253, 32049, 310, 3215, 11273, 327, 4440, 257, 292, 352, 651, 320, 1270, 281, 557, 290, 2134, 253, 29381, 19103, 273, 1837, 82, 87, 19, 323, 28841, 941, 432, 253, 4736, 273, 3215, 26208, 1097, 253, 3646, 285, 1318, 1159, 275, 3924, 374, 4754, 50276, 783, 4477, 2319, 326, 4679, 327, 15524, 49602, 513, 417, 16084, 2323, 275, 253, 1524, 1533, 5474, 339, 431, 248, 4477, 12661, 362, 8435, 20, 247, 495, 3924, 941, 8877, 5853, 323, 3215, 26208, 27311, 267, 2349, 351, 398, 9591, 28841, 391, 77, 285, 4030, 25184, 342, 3909, 391, 77, 253, 4477, 7568, 616, 2746, 275, 247, 1180, 273, 8113, 1754, 1453, 49602, 2486, 519, 14790, 285, 277, 17475, 50276, 783, 2746, 10260, 5520, 3410, 6733, 285, 41731, 13015, 990, 936, 423, 3082, 50275, 783, 10393, 273, 28841, 15302, 285, 3215, 11273, 2349, 351, 398, 604, 271, 4722, 1673, 273, 789, 533, 891, 717, 7514, 326, 253, 3237, 436, 7792, 310, 6760, 327, 403, 8489, 14916, 285, 891, 651, 320, 6110, 281, 923, 253, 1543, 846, 625, 25142, 273, 4030, 25184, 20544, 50276, 783, 2929, 285, 1332, 311, 4233, 310, 2590, 285, 3477, 281, 1239, 50276, 303, 27369, 3410, 6733, 50276, 783, 4477, 3894, 616, 2127, 4793, 285, 2098, 281, 3727, 352, 347, 1527, 2603, 342, 253, 941, 2451, 10801, 3966, 50276, 2520, 1332, 812, 7826, 320, 3732, 281, 247, 1781, 5235, 273, 49602, 285, 5115, 256, 5503, 275, 2426, 273, 3410, 6733, 50276, 20881, 1255, 265, 50276, 783, 1332, 4419, 3733, 271, 32049, 387, 253, 8310, 1979, 273, 253, 3126, 50276, 74, 717, 417, 13762, 326, 436, 2746, 41731, 13015, 6313, 3909, 3082, 604, 597, 403, 1669, 281, 14940, 4677, 374, 247, 921, 253, 16226, 273, 391, 8435, 1335, 3629, 556, 436, 8245, 2686, 5975, 2400, 50276, 74, 717, 417, 13762, 253, 2746, 651, 3700, 973, 281, 31204, 12132, 7533, 285, 625, 11132, 49602, 3676, 14785, 5188, 20571, 13068, 50275, 262, 651, 452, 644, 4722, 281, 923, 14023, 273, 643, 3215, 26208, 27005, 323, 253, 32049, 1881, 35421, 5006, 77, 4499, 422, 3966, 4754, 5474, 33032, 2520, 2929, 29328, 247, 495, 13311, 5304, 391, 77, 7792, 11253, 273, 32049, 3215, 26208, 342, 4440, 257, 292, 28841, 1442, 292, 25004, 285, 3909, 1442, 292, 25004, 253, 32049, 3215, 26208, 3924, 19732, 1131, 271, 273, 649, 1041, 48164, 1236, 2510, 25912, 10895, 281, 3037, 247, 6422, 6779, 840, 253, 28841, 3215, 26208, 4648, 247, 11518, 1332, 281, 3700, 253, 6311, 6779, 715, 8892, 29765, 46234, 4720, 253, 3909, 391, 77, 33526, 1029, 3410, 6733, 275, 7576, 3647, 519, 14790, 50276, 40480, 253, 5161, 7680, 273, 436, 2929, 310, 253, 10393, 273, 4440, 257, 292, 10895, 285, 28841, 50276, 9349, 281, 1347, 3215, 26208, 50276, 783, 4477, 671, 2589, 7000, 490, 77, 800, 4679, 281, 2096, 253, 7881, 285, 1774, 2216, 
10165, 50276, 45563, 50276, 20261, 512, 253, 4295, 403, 417, 747, 253, 5678, 7792, 310, 6422, 1561, 436, 7792, 253, 4477, 2589, 9470, 4679, 281, 2096, 1896, 18075, 285, 12106, 253, 7881, 436, 17914, 1708, 327, 2852, 8607, 665, 310, 7378, 281, 6923, 715, 436, 1673, 253, 3410, 6733, 327, 519, 14790, 22791, 310, 1199, 2169, 685, 2045, 1666, 25379, 285, 616, 37260, 10046, 1666, 25379, 342, 247, 1781, 8459, 50276, 20881, 1255, 253, 4583, 4028, 3290, 310, 5777, 8664, 436, 310, 271, 2523, 4768, 253, 2644, 2929, 323, 1650, 672, 253, 4477, 403, 12930, 616, 5661, 1543, 597, 897, 1077, 17854, 3000, 824, 347, 1175, 50276, 635, 1293, 8109, 1941, 275, 253, 10199, 253, 4477, 671, 6266, 253, 15504, 273, 616, 1332, 970, 253, 3159, 359, 971, 281, 50276, 20513, 17854, 28232, 11306, 4796, 253, 8249, 1318, 273, 436, 2929, 25761, 627, 403, 1142, 47412, 474, 3374, 50276, 84, 3243, 689, 7041, 689, 23708, 285, 7242, 273, 2426, 253, 2488, 35520, 616, 1332, 310, 247, 8723, 382, 2746, 247, 495, 13311, 2746, 342, 16298, 273, 10839, 723, 476, 10693, 320, 12258, 347, 8723, 382, 672, 253, 4477, 3748, 28841, 391, 77, 597, 943, 320, 625, 10182, 672, 597, 403, 760, 970, 6485, 32367, 436, 3198, 37699, 9366, 275, 253, 2929, 253, 4583, 2929, 3290, 3340, 275, 2426, 273, 253, 5661, 7794, 310, 8718, 326, 1146, 753, 891, 1158, 689, 23708, 253, 2929, 310, 417, 247, 1175, 2934, 253, 4477, 5393, 326, 352, 310, 247, 501, 3024, 10336, 2299, 604, 352, 1057, 417, 452, 17049, 4684, 253, 26086, 310, 31215, 50276, 35861, 281, 253, 915, 2144, 253, 7792, 310, 1308, 1522, 281, 1142, 4373, 22041, 323, 1650, 253, 2805, 7887, 310, 6685, 1774, 281, 19928, 752, 651, 5108, 604, 643, 952, 971, 281, 897, 436, 2929, 1057, 253, 25184, 1379, 247, 2257, 273, 3434, 50276, 783, 941, 42072, 5853, 3133, 1774, 625, 14720, 285, 8813, 403, 3058, 253, 4477, 3748, 1524, 15688, 3368, 285, 1781, 5028, 8037, 347, 616, 7364, 841, 403, 5272, 5717, 368, 323, 1146, 15246, 2490, 187, 4118, 18435, 27, 2520, 2929, 5611, 247, 2969, 22199, 323, 11138, 3410, 6733, 273, 3733, 3676, 35221, 4715, 7823, 323, 8113, 3169, 1453, 8892, 253, 2934, 310, 281, 897, 247, 495, 13311, 15722, 337, 3215, 26208, 5304, 14237, 327, 1236, 2510, 25912, 2460, 15302, 374, 3646, 3733, 342, 28841, 391, 77, 285, 495, 1442, 292, 25004, 253, 3646, 342, 3909, 391, 77, 436, 789, 2959, 6804, 10123, 432, 1740, 30628, 342, 581, 12009, 581, 5075, 12009, 285, 767, 5075, 25026, 253, 30628, 14109, 253, 5183, 12510, 273, 253, 4081, 2746, 5747, 253, 17647, 273, 253, 2746, 26614, 597, 4469, 2201, 7350, 5001, 253, 3710, 38135, 8664, 253, 3600, 463, 12777, 2133, 273, 6239, 327, 5304, 3215, 26208, 285, 3710, 27163, 285, 28913, 2175, 50276, 783, 4477, 22390, 1077, 7000, 6128, 281, 253, 30628, 5701, 534, 31637, 1142, 7681, 3374, 3982, 598, 275, 253, 3302, 10123, 387, 253, 990, 273, 253, 5955, 2180, 37317, 891, 22, 18068, 665, 858, 417, 11377, 275, 253, 11985, 285, 37317, 337, 22383, 75, 8838, 616, 4016, 17503, 273, 436, 2929, 1223, 253, 643, 767, 14285, 5075, 2997, 50276, 783, 913, 1239, 253, 2929, 253, 10123, 285, 253, 4477, 6128, 9257, 37317, 891, 22, 72, 11285, 2022, 43680, 403, 337, 253, 38135, 273, 362, 8435, 20, 285, 5816, 30404, 285, 11985, 273, 2720, 789, 278, 29035, 268, 24987, 391, 20, 78, 3966, 285, 374, 12497, 14023, 285, 490, 77, 569, 273, 2234, 1566, 11809, 253, 913, 10141, 253, 9311, 16690, 12282, 273, 253, 5393, 2987, 285, 6566, 597, 943, 320, 2783, 17336, 342, 436, 19529, 3021, 253, 7681, 15785, 273, 436, 789, 943, 417, 320, 29697, 1025, 407, 253, 6242, 273, 841, 2905, 2987, 26614, 253, 4477, 2879, 253, 
30404, 285, 11985, 670, 841, 2987, 275, 253, 17265, 7482, 534, 9713, 253, 30628, 4385, 275, 1635, 253, 4477, 671, 2530, 3081, 28913, 2175, 285, 8254, 6787, 534, 9713, 253, 1273, 1127, 5439, 407, 37317, 891, 22, 18068, 50276, 15337, 254, 337, 22383, 75, 4469, 7350, 670, 253, 5536, 18520, 1309, 30080, 22559, 285, 253, 689, 7041, 285, 689, 23708, 273, 253, 2746, 253, 913, 5821, 342, 436, 37317, 326, 690, 3448, 824, 347, 8723, 382, 943, 320, 7020, 264, 1066, 275, 253, 1735, 294, 4149, 273, 436, 7714, 50276, 29114, 512, 841, 715, 2395, 253, 913, 1119, 326, 253, 30080, 22559, 556, 9713, 253, 2201, 3374, 5439, 275, 253, 10123, 1014, 2167, 436, 789, 1057, 417, 6635, 22564, 5697, 352, 556, 2011, 21414, 1941, 273, 247, 8542, 2746, 326, 19132, 253, 4715, 6733, 273, 3676, 35221, 4715, 275, 11132, 8113, 3169, 1453, 8892, 436, 789, 778, 29238, 253, 3971, 323, 2852, 789, 281, 1287, 625, 7269, 3082, 3103, 253, 913, 11121, 326, 436, 789, 556, 4817, 253, 2534, 273, 14924, 387, 5723, 2824, 5747, 253, 6804, 2457, 17503 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
Below is a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:

this paper studies the extreme multi-label text classification problem and establishes new state-of-the-art (sota) results with the proposed method, namely cascadexml, which significantly simplifies the multi-stage training of the previous sota model xr-transformer. cascadexml enables end-to-end training with two key insights: 1) it leverages intermediate representations of transformer layers for learning the xmc sub-problems of the label tree, and 2) it fine-tunes at the full resolution of the label space.

strengths:
- leveraging intermediate representations for multi-resolution learning of xmc is novel
- the paper writing is clear and easy to follow
- the empirical results are quite promising

weaknesses:
- lack of an ablation study for the choice of intermediate transformer layers used for the embeddings
- lack of an ablation study on the loss functions

no potential negative societal impact.

the paper proposes cascadexml to solve the extreme multi-label text classification problem. compared to prior works, the proposed method utilizes features from multiple layers of bert for different granularities of the hierarchical label tree; the coarser labels with lower scores can be terminated early for more efficient learning. the proposed method is compared with several sota methods, including lightxml and xr-transformer, on the benchmark datasets wiki-500k, amazon-670k, and amazon-3m. i am not an expert in text classification and thus not familiar with most of the related works; my judgment here is mostly based on the merits and the presentation of the paper on its own. the paper in general is very well written: it explains the main drawbacks of existing works (although i am not sure if there are other works that already addressed these issues) and motivates cascadexml from those drawbacks. the core idea of cascadexml is intuitively sound, well explained, and should be easy to reproduce; the authors also include the code in the supplementary. i did not find too many weaknesses in the paper, but i do have some questions: 1) are all compared methods, or at least the main sota methods, based on the same bert model? this is not clearly stated in the paper. 2) for early termination of the coarse labels, there are risks of early wrong predictions as well; is there any measure to prevent that? n/a

the authors propose cascadexml, an end-to-end deep learning pipeline for extreme multi-label text classification. cascadexml leverages the multi-layer architecture of transformer models to handle different label resolutions with separate feature representations: specifically, for each label resolution in the hierarchical label tree (hlt), features from the corresponding transformer layer are used to predict a shortlist of candidate labels or meta-labels. the authors empirically show that cascadexml achieves state-of-the-art performance on three extreme multi-label datasets, and the computation of training and inference is faster than the current state-of-the-art transformer-based models.

strengths:
1. the approach of leveraging the multi-layer architecture of transformers for multi-resolution learning is interesting
2. the experimental results are good concerning both performance and computation
3. the analysis of the impact of label cluster size explains the improvement of cascadexml well
4. the paper is well organized and easy to follow

weaknesses:
major:
1. the method requires many more hyperparameters, such as the corresponding transformer layer and the number of meta-labels of each label resolution
2. the discussion of the sensitivity of the newly introduced hyperparameters is lacking
minor:
3. the high/low resolution mentioned in section 1 sounds ambiguous to me; they may be replaced with refined/coarse
4. the definitions of $\mathfrak{c}_t$ and $\mathfrak{r}_t$ are provided, but the definitions of $\mathfrak{c}_{ty}$ and $\mathfrak{r}_{ty}$ are not
5. line 234: attention -> attentionxml

the authors said the limitations and the potential negative societal impact are discussed in the appendix, but i did not find them; some discussion about the lack of theoretical guarantees or the limited search space of the hyperparameters would be good.

this work proposes cascadexml, which can leverage the multi-layered architecture to learn data representations corresponding to different resolutions in the hierarchical label tree, as well as fine-grained labels at the level of the extreme classifier; each layer of the model can perform label shortlisting at one level. the authors compare the proposed cascadexml to strong baselines on three xmc benchmark datasets, and experiments show that the proposed method outperforms these baselines and can achieve faster training and inference.

strengths:
1. the proposed cascadexml is end-to-end differentiable and achieves better results compared to previous baselines on three benchmark datasets
2. the proposed method is faster in both training and inference, which is an important aspect of the xmc task
3. this work is well written and easy to follow

weaknesses:
1. the idea of dynamic label shortlisting is very interesting, but i wonder how to handle the cases where the hlt contains more levels than the number of transformer layers. also, for an hlt containing fewer levels, do we have to change the model structure? if the model is trained on an hlt containing n levels, can we transfer it to an hlt with m levels, where m ≠ n?
2. why do the earlier layers in the transformer correspond to the higher levels of the hlt? more explanation of this may also help a bit with the answer to q1.
3. the proposed method relies heavily on the multi-layer structure of the transformer, which makes it not easy to extend to other frameworks; as mentioned above, the scalability and transferability of this method may be much more limited compared to the baselines adopted in this work.

the authors have discussed and addressed some of the concerns i have in the paper.

### Summary:
this paper proposes cascadexml, an end-to-end framework for the task of tree-based extreme multi-label text classification. it extracts representations from different layers of a bert model and then maps them to different levels of the hierarchical label tree (hlt). the proposed method shows strong p@k performance on benchmark datasets with improved efficiency during inference compared to other state-of-the-art methods, including xr-transformer and lightxml. two of the reviewers pointed out the lack of ablation studies for the choice of intermediate layers of the bert model and for the mapping between those transformer layers and the hlt levels; these problems were addressed in the updated version of the paper during the rebuttal, and the reviewers increased their scores as a result. given that 3 out of the 4 reviewers give a score of 7, the recommendation is to accept the paper.
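to make the multi-resolution idea described in these reviews concrete, the sketch below shows one way intermediate hidden states of a bert encoder could feed per-level classifiers over a hierarchical label tree; the layer-to-level mapping, cluster counts, and module names are illustrative assumptions, not the configuration reported for cascadexml.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class MultiResolutionClassifier(nn.Module):
    """Sketch: per-level label scoring from intermediate BERT layers.

    Assumptions (not taken from the paper): layers 4/8/12 map to three
    label resolutions, and the [CLS] token summarizes each level.
    """

    def __init__(self, level_sizes=(8_192, 65_536, 670_091), hidden_size=768):
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        self.level_layers = (4, 8, 12)  # coarse -> fine
        self.level_heads = nn.ModuleList(
            nn.Linear(hidden_size, n) for n in level_sizes
        )

    def forward(self, input_ids, attention_mask):
        outputs = self.encoder(
            input_ids,
            attention_mask=attention_mask,
            output_hidden_states=True,
        )
        hidden_states = outputs.hidden_states  # embeddings + one tensor per layer
        level_logits = []
        for layer_idx, head in zip(self.level_layers, self.level_heads):
            cls_state = hidden_states[layer_idx][:, 0]  # [CLS] at that depth
            level_logits.append(head(cls_state))
        # training: each level is supervised with the meta-labels of its HLT level;
        # inference: low-scoring meta-labels at a coarse level can be pruned so that
        # finer heads only score the surviving candidates (label shortlisting)
        return level_logits
```

in a full system, the logits of a coarser level would be used to shortlist candidate meta-labels so that only their children are scored at the next, finer level; returning all per-level logits, as above, is only meant to illustrate how one encoder pass serves several label resolutions.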
Below is a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:

this work introduces an input pyramid with different semantic levels for each modality and then aligns visual elements and linguistic elements in a hierarchical way. the proposed pyramidclip outperforms clip by a large margin.

strengths:
- the results are very promising; the model outperforms the sota methods across many datasets
- the overall framework is easy to follow

weaknesses:
- the idea of introducing roi features may increase the computational cost significantly
- in peer-level semantics alignment, the authors introduced coarse-grained global contrastive learning and fine-grained local contrastive learning; however, i did not find studies to validate that combining the two contrastive alignments helps model training
- the authors described the cross-level relation alignment in sec. 3.3; three structures are proposed, but i suggest the authors provide more analysis to prove their effectiveness

yes

under the contrastive clip learning framework, the work proposes to utilize more fine-grained information to produce multiple views of both the image and the text during training, and hence constructs more contrastive loss terms across different views; during inference/evaluation only the standard view is used. empirically, with 3 different architectures (resnet50, vit-b/32, vit-b/16), different pretraining data scales, and several downstream datasets, the authors show that the proposed approach achieves clear gains over the baseline systems. the work follows a natural motivation and achieves very good empirical gains over some strong baseline systems; overall the paper is well written and the empirical study is solid. a key concern i have is whether the comparison is fair enough: if i understand correctly, for each view of either the image or the text we need to feed the view into the model once, which leads to roughly 2x-3x additional computation cost. again, if my understanding is correct, despite using the same batch size the actual pretraining cost might be much higher for the proposed method than for the baselines, making the comparison less informative; a better comparison seems to be to make the batch size of the baselines larger until the training cost is comparable. i don't see any particular problem here.

this paper proposes hierarchical feature alignment for vision-language pretraining, called pyramidclip, which alleviates semantic mismatch as well as mutual compatibility problems, i.e., false positives and false negatives. pyramidclip constructs inputs with three levels of semantics in the visual and language modalities respectively, and then resolves semantic mismatch through peer-level semantic alignment and cross-level relation alignment; in addition, pyramidclip adopts a soft form of infonce to deal with mutual compatibility.

strengths:
1. this paper is well written and easy to follow
2. the motivation and solution of the article are clear: more precise hierarchical feature alignment for tackling semantic mismatch, and softening infonce for mutual compatibility
3. the experiments are well designed and the results are excellent

limitations (have the authors adequately addressed the limitations and potential negative social impact of their work? if not, please include constructive suggestions for improvement; authors should be rewarded rather than punished for being upfront about the limitations of their work and any potential negative societal impact):
1. needs more explanation about how the training set is constructed
2. in order to compare with some recently published works, e.g., [1], it is recommended that the authors supplement the results on smaller-scale datasets such as cc3m
3. the categories obtained by object detection are simply joined by commas; would different joining forms have an impact on the results, e.g., splicing with spaces?

[1] robust cross-modal representation learning with progressive self-distillation, cvpr 2022

yes

this paper proposes to improve clip by adding contrastive losses at different levels. specifically, for each image in the original data they construct image views at the global, local, and region levels and also create their corresponding text captions; therefore, for each image-caption pair in the original dataset they can create two additional image-caption pairs, and the model is trained with additional contrastive losses. given the newly constructed data, they also soften the original contrastive loss with a label-smoothing-like technique. experiments demonstrate improvements over several baselines on zero-shot retrieval, linear probing on image classification, object detection, and semantic segmentation tasks.

strengths:
1. the empirical results are strong, as they can outperform several popular baselines
2. the paper is well structured
3. i like the idea of softening the contrastive loss objective, which makes sense, and the paper shows it works well

weaknesses:
1. the comparisons between their model and the baselines may not be fair: the paper uses multiple pretrained models for their model, e.g., text summarization and object detection models, which can introduce more supervision; also, because their method will create more image-caption pairs for their model, they would have more pretraining data than the baselines
2. some designs are questionable: for example, the text captions in image-caption datasets are typically short, whereas they use a pretrained text summarization model, which is designed for summarizing long documents, to further shorten the captions, which does not make sense to me
3. while it is true that the pretraining datasets can be noisy, the problem may be alleviated if sufficiently large data are used; it would be good to see if they can keep the performance gain as more data are included. for example, they can plot performance-versus-data-size curves for both their model and the baselines and see the performance gap when the data size varies; i am not sure if their method is scalable, as when more data is included many of the problems mentioned in the paper may be alleviated

this paper proposes a new framework called pyramidclip for vision-language pretraining (vlp). pyramidclip first extracts multi-level features from both the visual and linguistic domains and then conducts contrastive learning by aligning visual and linguistic features in both peer-level and cross-level ways; in this way, the vision and language representations learned by pyramidclip encode better image-text alignment, which alleviates the semantic mismatch problem that exists in the image-text pairs used for pretraining. moreover, the authors also replace the loss term of the negative samples in contrastive learning with a softened version to tackle the problem that different image-text pairs may have potential correlation. the empirical results show that pyramidclip clearly outperforms the clip baseline in a variety of downstream tasks and also achieves sota performance on several downstream tasks as compared with other vlp models.

strengths:
- this paper addresses the problem of the quality of image-text pairs, which is an important problem in vlp, using hierarchical feature alignment; the core idea and specific designs of the proposed pyramidclip are generally reasonable
- the authors conduct extensive experiments to demonstrate the effectiveness of pyramidclip, covering different backbone architectures (resnet50 and vit-b), pretraining datasets of varying sizes, and a variety of downstream tasks (zero-shot image classification, zero-shot image-text retrieval, linear probe, object detection, and instance segmentation)
- the ablation study is comprehensive, showing that all the components of the proposed method are conducive
- qualitative analyses are conducted to intuitively show that pyramidclip learns better vision and linguistic representations than clip
- the paper is generally well written and easy to follow

weaknesses:
1. some specific designs of pyramidclip seem counterintuitive: random crop cannot guarantee the quality, especially for the local view; for example, if the local view is irrelevant to the textual description, minimizing the distance of the corresponding features may confuse the model. it is also unclear how the cross-level alignment helps the modelling of relations between salient objects; for example, how does the model know that a table is next to a chair by contrasting the roi features with the textual summary or the original text? it would be better to provide an intuitive explanation
2. the difference between pyramidclip and existing vlp methods that also introduce multi-level semantics is not clearly discussed in the related work; the authors state that, different from the methods mentioned above, each level is input to the corresponding encoder individually without concatenating, but such a difference seems trivial and incremental
3. many methods described in the related work are not compared in the experiments, especially the methods that also introduce multi-level semantics, e.g., mvptr and xvlm
4. the softened objective function treats all the negative samples equally using label smoothing, which is suboptimal considering different image-text pairs should have different degrees of correlation

### Summary:
this paper proposes pyramidclip. it improves the contrastive learning method clip with more fine-grained information to produce multiple views of both the image and the text during training; during inference/evaluation only the standard view is used. the empirical results with different network architectures at different pretraining data scales show that the proposed pyramidclip achieves clear gains over the baseline methods. the paper is comprehensively discussed and receives unanimous accept recommendations from all reviewers, leading to an accept decision; the authors are highly encouraged to revise the paper accordingly. the authors reported results on a customized benchmark and showed improvement over their own baseline; in the future, the authors are highly encouraged to report results based on the common benchmark below, so that readers can clearly see the position of pyramidclip in the context of all other similar papers in the literature: httpscomputervisioninthewildgithubioeccv2022
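the softened infonce objective mentioned in these reviews can be pictured with the short sketch below; the symmetric image-to-text / text-to-image form and the smoothing value are assumptions made for illustration, since the reviews only state that the negative (non-matching) term of the contrastive loss is relaxed to tolerate partially correlated image-text pairs.

```python
import torch
import torch.nn.functional as F


def softened_info_nce(image_feats, text_feats, temperature=0.07, smoothing=0.2):
    """Sketch of a label-smoothed (softened) InfoNCE loss for image-text batches.

    Assumption: smoothing spreads a small amount of target probability onto the
    off-diagonal pairs instead of treating them as strictly negative.
    """
    image_feats = F.normalize(image_feats, dim=-1)
    text_feats = F.normalize(text_feats, dim=-1)
    logits = image_feats @ text_feats.t() / temperature  # (B, B) similarity matrix

    batch_size = logits.size(0)
    off_diag = max(batch_size - 1, 1)
    # hard InfoNCE targets would be the identity matrix; soften them instead
    targets = torch.full_like(logits, smoothing / off_diag)
    targets.fill_diagonal_(1.0 - smoothing)

    loss_i2t = (-targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
    loss_t2i = (-targets.t() * F.log_softmax(logits.t(), dim=1)).sum(dim=1).mean()
    return 0.5 * (loss_i2t + loss_t2i)
```

with smoothing set to 0 this reduces to the standard symmetric infonce objective used by clip, which is why softening can be viewed as a drop-in relaxation of the negative term rather than a new loss family.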
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 789, 23970, 281, 3989, 271, 3280, 39694, 342, 1027, 24705, 2308, 323, 1016, 36453, 352, 840, 8495, 84, 5304, 3603, 285, 32019, 3603, 275, 247, 24498, 1039, 253, 4081, 39694, 11536, 41731, 13015, 17230, 342, 247, 1781, 8459, 50274, 45563, 50276, 783, 1543, 403, 1077, 12532, 253, 1566, 41731, 13015, 253, 256, 5503, 3082, 2439, 1142, 15302, 50276, 783, 4583, 7792, 310, 3477, 281, 956, 50276, 20881, 1255, 50276, 783, 2934, 273, 16984, 687, 74, 3386, 778, 2572, 253, 15180, 2105, 3012, 50275, 249, 14218, 5251, 35185, 12420, 253, 4477, 5611, 25319, 72, 11273, 4156, 4499, 4715, 285, 4030, 72, 11273, 1980, 4499, 4715, 2299, 891, 42126, 1089, 253, 625, 2175, 281, 17813, 326, 16248, 767, 4499, 43097, 7729, 1566, 3733, 50276, 783, 4477, 2529, 253, 2831, 5251, 5886, 12420, 275, 4706, 5922, 1264, 5289, 403, 4081, 533, 891, 1804, 253, 4477, 5277, 625, 1783, 281, 5276, 616, 12510, 4754, 5474, 33032, 4524, 253, 4499, 422, 17230, 4715, 7792, 253, 789, 29328, 281, 16584, 625, 4030, 72, 11273, 1491, 281, 4711, 2709, 6849, 273, 1097, 253, 2460, 285, 2505, 1309, 3733, 285, 7613, 21031, 625, 4499, 422, 2957, 2426, 2439, 1027, 6849, 1309, 17032, 15419, 2368, 760, 253, 2629, 1859, 310, 908, 50276, 358, 5378, 1037, 342, 495, 1027, 35615, 501, 3024, 1235, 34490, 67, 1237, 34490, 67, 1036, 1027, 3215, 26208, 941, 11498, 285, 2067, 15450, 15302, 4477, 921, 326, 253, 4081, 2746, 33526, 2590, 6351, 689, 253, 8245, 2718, 50276, 783, 789, 3637, 247, 3626, 16038, 285, 33526, 1077, 1175, 16774, 6351, 689, 690, 2266, 8245, 2718, 4583, 253, 2929, 310, 973, 3542, 285, 253, 16774, 1263, 310, 4891, 50276, 66, 2234, 4468, 891, 452, 310, 1880, 253, 5301, 310, 4344, 2217, 604, 891, 2096, 9113, 323, 1016, 1859, 273, 2057, 253, 2460, 390, 253, 2505, 359, 878, 281, 3997, 253, 1859, 715, 253, 1566, 2378, 534, 5644, 281, 11467, 374, 89, 50276, 20, 89, 3081, 13782, 2105, 969, 604, 619, 4685, 310, 3451, 5747, 970, 253, 1072, 14604, 1979, 253, 4588, 3215, 26208, 2105, 1537, 320, 1199, 2169, 323, 253, 4081, 1332, 685, 1666, 25379, 2403, 253, 5301, 1679, 1491, 247, 1805, 5301, 3133, 281, 320, 1056, 253, 14604, 1979, 273, 1666, 25379, 4067, 1919, 253, 3733, 2105, 310, 10870, 891, 13414, 923, 667, 1798, 1895, 1060, 5474, 33032, 2520, 2929, 29328, 24498, 4735, 12420, 323, 8113, 3448, 3215, 26208, 1925, 39694, 11536, 534, 7374, 6584, 684, 24705, 29713, 347, 973, 347, 15577, 22862, 3237, 26332, 3221, 37865, 285, 3221, 2297, 3993, 39694, 11536, 21031, 14800, 342, 1264, 2308, 273, 35185, 275, 5304, 285, 3448, 33433, 2975, 285, 840, 501, 14503, 24705, 29713, 949, 14218, 5251, 24705, 12420, 285, 2831, 5251, 5886, 12420, 275, 1635, 39694, 11536, 47932, 247, 2602, 830, 273, 2192, 19131, 281, 2968, 342, 15577, 22862, 20544, 50276, 18, 436, 2929, 310, 973, 3542, 285, 3477, 281, 956, 50276, 19, 253, 16038, 285, 2900, 273, 253, 3929, 403, 2590, 625, 10799, 24498, 4735, 12420, 323, 46710, 24705, 29713, 285, 2602, 2980, 2192, 19131, 323, 15577, 22862, 50276, 20, 253, 4679, 403, 973, 4158, 285, 253, 1543, 403, 7126, 50275, 17465, 569, 50276, 9802, 253, 4477, 18212, 9713, 253, 7364, 285, 2442, 4016, 2675, 3486, 273, 616, 789, 604, 417, 4496, 2486, 25799, 13991, 323, 7756, 4477, 943, 320, 33302, 2581, 685, 29163, 323, 1146, 598, 6342, 670, 253, 7364, 273, 616, 789, 285, 667, 2442, 4016, 38058, 3486, 50276, 18, 3198, 625, 8813, 670, 849, 253, 3733, 873, 310, 8818, 50276, 19, 275, 1340, 281, 7277, 342, 690, 4102, 
3863, 789, 26931, 337, 352, 310, 8521, 326, 253, 2488, 476, 8499, 253, 1543, 327, 4577, 4311, 15302, 824, 347, 25215, 20, 78, 50276, 20, 253, 9050, 2797, 407, 1789, 5481, 403, 3365, 7416, 407, 764, 284, 1880, 1027, 6036, 4948, 452, 271, 3486, 327, 253, 1543, 24088, 30772, 342, 8470, 50275, 18, 10237, 2831, 24353, 6779, 4715, 342, 13439, 1881, 8155, 21755, 30105, 1087, 1384, 1423, 4754, 5474, 33032, 2520, 2929, 29328, 281, 3157, 17230, 407, 6240, 4499, 422, 11655, 387, 1027, 2308, 5742, 323, 1016, 2460, 275, 253, 3236, 941, 597, 3989, 2460, 6849, 387, 4156, 1980, 285, 2919, 2308, 285, 671, 2794, 616, 3969, 2505, 3403, 621, 3103, 323, 1016, 2460, 34480, 4667, 275, 253, 3236, 10895, 597, 476, 2794, 767, 3081, 2460, 34480, 8557, 285, 253, 1566, 310, 10166, 342, 3081, 4499, 422, 11655, 1677, 253, 9841, 8818, 941, 597, 671, 50007, 253, 3236, 4499, 422, 2957, 342, 247, 13301, 78, 4902, 272, 3022, 5853, 4679, 7568, 11701, 689, 2067, 1666, 25379, 327, 1182, 254, 6934, 302, 25064, 4872, 39578, 327, 2460, 9162, 50276, 6082, 5481, 285, 24705, 26405, 8892, 20544, 337, 253, 16774, 1543, 403, 2266, 347, 597, 476, 562, 32231, 2067, 4633, 1666, 25379, 374, 253, 2929, 310, 973, 34218, 495, 891, 751, 253, 2934, 273, 2602, 2980, 253, 4499, 422, 2957, 8103, 534, 2789, 3282, 285, 253, 2929, 2722, 352, 2987, 973, 50276, 20881, 1255, 265, 337, 253, 14023, 875, 616, 1566, 285, 1666, 25379, 778, 417, 320, 4344, 253, 2929, 4648, 2709, 3215, 11273, 3210, 323, 616, 1566, 24088, 2505, 10405, 1320, 285, 1789, 5481, 3210, 534, 476, 9569, 625, 20446, 671, 984, 616, 1332, 588, 2794, 625, 2460, 34480, 8557, 323, 616, 1566, 597, 651, 452, 625, 3215, 26208, 941, 685, 253, 1666, 25379, 374, 690, 11809, 403, 30455, 323, 1650, 253, 2505, 3403, 621, 275, 2460, 34480, 15302, 403, 5431, 2159, 5727, 597, 897, 247, 3215, 11273, 2505, 10405, 1320, 1566, 534, 310, 4158, 323, 10405, 3006, 1048, 7177, 281, 2007, 48399, 253, 3403, 621, 534, 1057, 417, 1056, 3282, 281, 479, 495, 1223, 352, 310, 2032, 326, 253, 3215, 26208, 15302, 476, 320, 27620, 253, 1895, 778, 320, 26353, 4215, 604, 10481, 1781, 941, 403, 908, 352, 651, 320, 1175, 281, 923, 604, 597, 476, 1978, 253, 3045, 6351, 347, 625, 941, 403, 2908, 323, 1650, 597, 476, 7484, 1347, 3086, 682, 1979, 9191, 323, 1097, 616, 1566, 285, 1666, 25379, 285, 923, 253, 3045, 8037, 672, 253, 941, 1979, 16149, 891, 717, 417, 2119, 604, 616, 1332, 476, 320, 44755, 347, 672, 625, 941, 310, 2908, 1142, 273, 253, 3237, 5393, 275, 253, 2929, 778, 320, 26353, 4215, 5474, 33032, 2520, 2929, 29328, 247, 747, 7792, 1925, 39694, 11536, 323, 8113, 12982, 3215, 26208, 362, 24343, 39694, 11536, 806, 16756, 1554, 48268, 3386, 432, 1097, 253, 5304, 285, 32019, 10625, 285, 840, 2589, 84, 4499, 422, 4715, 407, 8495, 272, 5304, 285, 32019, 3386, 275, 1097, 14218, 5251, 285, 2831, 5251, 4088, 275, 436, 1039, 253, 8113, 285, 3448, 14237, 6311, 407, 39694, 11536, 22573, 1805, 4440, 292, 2068, 12420, 534, 7374, 6584, 684, 253, 24705, 29713, 1895, 326, 4961, 275, 253, 4440, 292, 2068, 8557, 323, 3215, 26208, 25761, 253, 4477, 671, 8171, 253, 2957, 1307, 273, 253, 4016, 3530, 275, 4499, 422, 4715, 342, 247, 39451, 2715, 281, 18915, 253, 1895, 326, 1027, 4440, 292, 2068, 8557, 778, 452, 2442, 5921, 253, 16774, 1543, 921, 326, 39694, 11536, 4518, 41731, 13015, 253, 17230, 8245, 275, 247, 5235, 273, 15450, 8892, 285, 671, 33526, 256, 5503, 3045, 327, 2067, 15450, 8892, 347, 2429, 342, 643, 362, 24343, 3210, 50276, 296, 3755, 20556, 50276, 2520, 2929, 12453, 253, 1895, 273, 253, 3290, 273, 4440, 292, 2068, 8557, 534, 310, 
271, 1774, 1895, 275, 362, 24343, 970, 24498, 4735, 12420, 253, 5161, 2934, 285, 2173, 11809, 273, 253, 4081, 39694, 11536, 403, 3839, 5272, 50276, 783, 4477, 2589, 9470, 4679, 281, 7568, 253, 12510, 273, 39694, 11536, 10985, 1027, 27882, 35615, 501, 3024, 1235, 285, 9084, 67, 3215, 26208, 15302, 273, 11962, 9552, 285, 247, 5235, 273, 15450, 8892, 1182, 254, 6934, 302, 2460, 9162, 1182, 254, 6934, 302, 4440, 292, 2068, 25064, 4872, 10304, 1789, 5481, 285, 4227, 26405, 50276, 783, 28913, 1263, 310, 11088, 4645, 326, 512, 253, 4295, 273, 253, 4081, 1332, 403, 49598, 422, 50276, 15847, 6716, 6260, 403, 5196, 281, 540, 41597, 921, 326, 39694, 11536, 33772, 1805, 8113, 285, 32019, 6779, 685, 17230, 50276, 783, 2929, 310, 3839, 973, 15720, 285, 3477, 281, 956, 50274, 20881, 1255, 265, 50276, 18, 690, 2173, 11809, 273, 39694, 11536, 1646, 4828, 565, 48714, 50276, 14719, 17177, 2550, 12215, 253, 3290, 3340, 323, 253, 1980, 1859, 323, 1650, 604, 253, 1980, 1859, 310, 19124, 281, 253, 45860, 5740, 28699, 253, 4181, 273, 253, 3969, 3386, 778, 40678, 253, 1566, 50276, 262, 310, 12744, 849, 253, 2831, 5251, 12420, 7729, 253, 26278, 273, 2493, 875, 43066, 5113, 323, 1650, 849, 1057, 253, 1566, 871, 326, 247, 2829, 310, 1735, 281, 247, 6951, 407, 42455, 253, 687, 74, 3386, 281, 253, 45860, 6010, 390, 253, 3236, 2505, 352, 310, 1805, 281, 2085, 271, 27350, 8813, 50276, 19, 253, 3064, 875, 39694, 11536, 285, 5368, 362, 24343, 3082, 326, 671, 9569, 1554, 48268, 35185, 310, 417, 4518, 5469, 275, 253, 2905, 789, 253, 4477, 1375, 326, 1027, 432, 3082, 5393, 1840, 1016, 1268, 310, 3280, 281, 253, 3969, 32049, 15978, 1293, 32147, 839, 824, 247, 3064, 3133, 14916, 285, 32809, 50276, 20, 1142, 3082, 2529, 275, 253, 2905, 789, 403, 417, 2429, 275, 253, 4679, 3340, 253, 3082, 326, 671, 9569, 1554, 48268, 35185, 24088, 278, 87, 4773, 285, 1269, 87, 20347, 50275, 783, 50007, 8103, 1159, 26574, 512, 253, 4016, 3530, 9696, 970, 5203, 36971, 534, 310, 749, 29776, 7296, 1027, 4440, 292, 2068, 8557, 943, 452, 1027, 7759, 273, 5921, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 39694, 11536, 50276, 262, 3157, 4499, 422, 4715, 1332, 17230, 342, 625, 4030, 72, 11273, 1491, 281, 4711, 2709, 6849, 273, 1097, 253, 2460, 285, 2505, 1309, 3733, 1309, 17032, 15419, 2368, 760, 253, 2629, 1859, 310, 908, 253, 16774, 1543, 342, 1027, 2990, 35615, 387, 1027, 3215, 26208, 941, 11498, 921, 326, 253, 4081, 39694, 33526, 2590, 6351, 689, 253, 8245, 3082, 50276, 783, 2929, 310, 9483, 1242, 5469, 285, 14488, 42293, 2997, 432, 512, 30628, 4283, 281, 271, 2997, 3061, 253, 4477, 403, 4122, 14659, 281, 49620, 253, 2929, 15672, 253, 4477, 2361, 1543, 327, 247, 32176, 22791, 285, 2692, 7756, 1754, 327, 697, 1211, 8245, 275, 253, 2852, 253, 4477, 403, 4122, 14659, 281, 1304, 1543, 1754, 327, 247, 1846, 22791, 2708, 50276, 601, 326, 10668, 476, 4518, 923, 253, 1899, 273, 39694, 11536, 275, 253, 3634, 273, 512, 643, 2074, 9380, 275, 253, 6239, 50275, 3614, 16777, 677, 1297, 565, 248, 32778, 7280, 900, 70, 550, 87, 938, 1423 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this work presents a method to efficiently compute upper bounds for the local lipschitz constant of relu networks in short interval bound propagation techniques are applied to the backward computational graph which yields an upper bound on the norm of the clarke jacobian of the network which constitutes an upper bound for the local lipschitz constant careful consideration of the bounds for wellknown activation functions provides tighter results over previous baselines such improvements are confirmed on experiments using mlps and convolutional networks trained on mnist cifar10 and tinyimagenet after rebuttal the authors have addressed my main concerns i am inclined to increase my score originality i think the main weakness is in this front as to me the contribution is not clear the authors claim that their contribution is to generalize existing bound propagation methods originally used for nn verification to a higher order backward computational graph for bounding the clarke jacobian however to me it appears that b precisely does that with the caveat that their results are for differentiable activation functions where the clarke jacobian is just the usual jacobian as such i believe that this work rather adapts the technique in b to the setting of nondifferentiable activation functions like relu together with a more careful approach that provides tighter bounds in some cases compared to the methods in b this should be clarified by the authors and if it is the case the claims should be toned down in summary i believe that applying interval bound propagation techniques to the jacobian which is the backward computational graph is already done in b although in a slightly different setting quality and clarity the paper is in general wellwritten and easy to follow and the bibliography appears to cover all of the previous related work the differences with most previous works are clearly explained again with the exception of recurjac b where i believe the authors should expand also in line 151 the authors claim chain rule can be used to compute the clarke jacobian for the whole relu network however it is well known that the notion of chain rule is more complicated when dealing with nonsmooth compositions i would like the authors to clarify which chain rule they are talking about see for example theorem 4 in a significance i believe it is important to extend the method in b to the case of nonsmooth activations as activations like relu and leakyrelu are commonly used in practice despite the fact that they do not have a normal gradient at all points moreover theorem 4 in this work is a tightness result which implies that the relaxation presented in this work is the tightest possible which is an important step in this line of research however related to my main criticism of the work if indeed the idea of applying interval bound propagation to the backward computational graph is already present in b the method presented here would be more of a refinement of a previous method rather than a completely different approach which is less significant in the experiments it is not clear why recurjac is used despite it being tailored for differentiable activation functions can the authors clarify if using recurjac for nonsmooth activation functions still provides a valid upper bound for the norm of the jacobian another issue with the experiments is a lack of confidence intervals because the networks used are
a result of a stochastic process sgd there might be differences due to randomness and without confidence intervals it is hard to assess if the improvement is significant for example the method lipbab achieves a runtime of 292 for mlps of width 32 and 4 layers which is much faster than the method presented here while providing a lipschitz constant that is just as good however in other cases it is considerably slower and provides worse bounds adding confidence intervals could possibly let the authors conclude if their method is consistently outperforming the baselines nevertheless their method without branchandbound appears to be consistently faster and more scalable than recurjac references a support functions of clarkes generalized jacobian and of its plenary hull cyril imbert b recurjac an efficient recursive algorithm for bounding jacobian matrix of neural networks and its applications huan zhang pengchuan zhang chojui hsieh aaai 2019 no negative societal impact the main limitation is the cost of branchandbound which the authors somehow address by providing an option to use the method without bab docsepthis paper extends existing clarke jacobian estimation from interval bounds to the linear relaxation the authors further apply the branchandbound method to tighten the estimation when the time budget allows the experiments show that the linearrelaxed clarke jacobian with branchandbound produces more precise results than selected benchmarks the authors also use the monotonicity analysis as an application of clarke jacobian strengths this work provides tighter estimation of the clarke jacobian norm which has practical benefits for the local lipschitz constant evaluation local lipschitz constant is an essential mathematical property of the neural network in an input region weaknesses 1 this work is incremental tightening neuralnetwork certification from interval bounds to linear relaxation has been studied in a number of works for example 1 and 2 the branchandbound method has also been used to compute the lipschitz constant of neural networks 3 this work brings little new insight or understanding to neuralnetwork certification the benefit is mostly practical from compiling known methods together which is not surprising 2 line 179 mentions that in this paper the authors mainly consider ellinftynorm for simplicity i doubt whether this is for simplicity or whether the linearrelaxed clarke jacobian is only able to handle linearly constrained perturbation spaces in practice see the questions section below 3 the evaluation raises some doubts the first part is related to 2 lipsdp is designed to compute the lipschitz constant with respect to ell2 perturbations scaling the result of lipsdp for ell2 perturbations seems an unfair comparison to lipsdp unless the authors make it clear this work does not apply to ell2 perturbations the second part is for the monotonicity analysis the authors provide some numbers from experiments but there is no justification of how good the result is 1zhang h weng t chen p hsieh c and daniel l efficient neural network robustness certification with general activation functions in advances in neural information processing systems pp 49444953 2018 url httpsproceedingsneuripsccpaper2018hashd04863f100d59b3eb688a11f95b0ae60abstracthtml 2lyu z ko c kong z wong n lin d and daniel l fastened crown tightened neural network robustness certificates in the thirtyfourth aaai conference on artificial intelligence aaai 2020 the thirtysecond innovative applications of artificial intelligence conference iaai 2020 the
tenth aaai symposium on educational advances in artificial intelligence eaai 2020 new york ny usa february 712 2020 pp 50375044 2020 url httpsaaaiorgojsindexphpaaaiarticleview5944 3bhowmick a dsouza m and raghavan g s lipbab computing exact lipschitz constant of relu networks in international conference on artificial neural networks pp 151162 332 springer 2021 no major concerns one suggestion for paper writing please use other graphical features than colors to distinguish different parts of the plot this is not friendly for colorblind readers or black and white printers docsepthe paper presents a novel method to compute upper bounds to the local lipschitz constant of a neural network by using linear bound propagation to bound the norm of the clarke jacobian of the network the authors demonstrate that the proposed approach computes tighter bounds than the stateoftheart in reasonable time on synthetic networks and on networks trained for popular image datasets finally the method is tested on monotonicity analysis as an example application the authors make a more careful use of linear bound propagation techniques in the context of estimating a networks local lipschitz constant resulting in tighter bounds than the stateoftheart estimators in reasonable time furthermore the algorithm supports branching to tighten the bounds and a custom branching strategy is presented which appears to work well in the considered experimental settings while the approach is not entirely original and it draws on existing techniques such as recurjac it is clearly presented and effective i believe it might be of interest to the community for what concerns the weaknesses the experimental section is somewhat limited as it mostly relies on small and synthetic networks it would be interesting to test the algorithm on larger networks and possibly on neural network verification as for instance done by the recurjac authors at that point i would be curious to see how lipschitzbased algorithms compare with common incomplete verifiers especially those based on optimized linear bounds such as alpha and betacrown analogously it would be interesting to see an extension of section 42 on l2 norm which bounding algorithms for lipschitz constants routinely support the authors adequately discuss the limitations of their work however it would be better for the reader to have them condensed in a section in the appendix ### Summary:
the paper develops a methodology for computing the lipschitz constant of relu neural networks when the input perturbations are measured in the linfinity norm the method is based on computing tight upper bounds on the clarke jacobian the basic idea is to apply interval bound propagation techniques to the backward computational graph which yields an upper bound on the norm of the clarke jacobian of the network experimental results show superiority over sota in terms of scalability runtime and the computed bound the reviewers had a number of concerns most of which were addressed during the discussion phase i recommend that the authors revise the paper and the experiments according to the reviewers comments as well as their own responses the paper was also discussed among the reviewers one main point of discussion was the novelty of the work compared to prior art eg lyu et al and the fact that bounding the norm of the clarke jacobian seems to be only beneficial for linfinity perturbations however some reviewers argued and i agree that the paper improves over sota methods quite notably in terms of efficiency and tightness and the method scales to much larger models compared to prior works scale is actually an important challenge in this topic as a result i would vote for accepting the paper as a matter of taste i do not think i agree with this sentence in the abstract and similar sentences in other parts of the paper existing methods for computing lipschitz constants either are computationally inefficient or produce loose upper bounds we do have good methods that provide nontrivial upper bounds on the lipschitz constant of nns while i agree that the scalability of those methods is still to be improved we cannot really call them inefficient hence i believe that this sentence and similar sentences in the paper could be better rephrased
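As a rough illustration of the bounding idea described in this review and summary (propagating interval bounds through the backward graph to upper-bound the norm of the Clarke Jacobian, and hence the local Lipschitz constant), the sketch below works through the simplest case: a two-layer ReLU network under an l-infinity input perturbation. It is a minimal interval-bound version of the idea, not the paper's linear-relaxation or branch-and-bound algorithm, and the layer shapes, variable names, and toy numbers are assumptions made for the example.

```python
import numpy as np

def local_lipschitz_upper_bound(W1, b1, W2, x0, eps):
    """Upper bound on the local Lipschitz constant (l-inf input, l-inf output)
    of f(x) = W2 @ relu(W1 @ x + b1) over the ball ||x - x0||_inf <= eps,
    obtained by interval arithmetic on the Clarke Jacobian W2 @ diag(d) @ W1."""
    # 1. Pre-activation bounds via plain interval bound propagation.
    center = W1 @ x0 + b1
    radius = np.abs(W1) @ (eps * np.ones_like(x0))
    lower, upper = center - radius, center + radius

    # 2. Each ReLU's Clarke derivative lies in an interval [d_lo, d_hi] subset [0, 1]:
    #    stably active -> [1, 1], stably inactive -> [0, 0], unstable -> [0, 1].
    d_lo = (lower > 0).astype(float)
    d_hi = (upper > 0).astype(float)

    # 3. Interval enclosure of diag(d) @ W1 (valid because d is nonnegative).
    M_lo = np.minimum(d_lo[:, None] * W1, d_hi[:, None] * W1)
    M_hi = np.maximum(d_lo[:, None] * W1, d_hi[:, None] * W1)

    # 4. Multiply the interval matrix by the fixed W2 using centre/radius form.
    Mc, Mr = (M_lo + M_hi) / 2.0, (M_hi - M_lo) / 2.0
    Jc, Jr = W2 @ Mc, np.abs(W2) @ Mr
    J_abs_max = np.maximum(np.abs(Jc - Jr), np.abs(Jc + Jr))

    # 5. The l-inf operator norm (max absolute row sum) of any Jacobian in the
    #    enclosure, and hence the local Lipschitz constant, is at most this value.
    return float(np.max(J_abs_max.sum(axis=1)))

# toy usage with random weights
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
W2 = rng.normal(size=(3, 8))
print(local_lipschitz_upper_bound(W1, b1, W2, x0=np.zeros(4), eps=0.1))
```

The linear-relaxation and branch-and-bound machinery discussed by the reviewers would replace steps 1 to 4 with tighter enclosures (and split the unstable neurons), while the final norm step stays the same.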
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper presents a model for detecting irregularities in images such as would occur in manufacturing acceptance testing as a form of anomaly detection the claims made are for a principled modeling approach and an adaptive algorithm quick to learn from just a few samples based on energy based models for modeling data densities the paper claims improvements that avoid the need for retraining for new tasks such as instead of synthesizing negative samples from noise using a more targeted method of learning from an inpainting operation to learn anomalies the paper claims a principled approach to detecting rare samples compared to normal ones one would expect that abnormality implies improbability the principle here is that something abnormal is something whose probability is small the paper never avails itself of any discussion of what it means to be anomalous in terms of probability similarly neither does the evaluation section mention anything about false positive or false negative detection rates as an example of this confusion the use of the term normal in the introduction could refer to a normal ie gaussian distribution of the nonanomalous population fitting a normal distribution to samples then looking for samples that fall on the tails of the distribution is a conventional method to detect anomalies since this is not what the paper proposes the authors may want to clarify this at the outset however it raises the speculation whether the mean squared error expressed in equation 8 is just implicitly an appeal to a normal distribution a quadratic error term is consistent with the use of a gaussian normal distribution honestly i find it hard to tell if the evaluation presented in the paper makes the case for improvement over other deep learning methods or more conventional image analysis methods it appears the main takeaway from the evaluation section is the acceptable detection ability on a limited number of samples how does this method stack up with previous methods when given comparably large numbers of training samples in short the term anomaly detection has been applied to a wide range of unsupervised techniques with little in common among them this literature using deep learning methods that have no connection to the existing literature that is more than a decade old calls into question whether the proposed methods are competitive with previous methods that do not originate by applying deep learning techniques one really has to look at the history of the field to claim to have made an advance my apologies for a shallow review due to a lack of familiarity with this branch of the field however there are some basic questions that need to be raised about the lack of connection of the field with the vast work that precedes the current trend in deep learning to really understand the significance of the work docsepthe paper proposes a framework for anomaly detection and localization that allows fast adaptation to new tasks specifically the authors propose an energybased model ebm with an adaptive sparse coding layer directly trained with normal features of a target task a metalearning process is followed to extract common knowledge across tasks enabling fewshot adaptation shrinkage functions sparse coding with large receptive fields and learning by inpainting are introduced to improve and accelerate the ebm training strengths the method adapts to new tasks autonomously ablation
study and comparison with supervised models exhaustive implementation details in the appendix section weaknesses experimental analysis is limited to image and video datasets the miou metric is never defined in the paper fairly limited scope of competitor methods which could depend on the limited suitability of other methods in the adopted setting the paper proposes an interesting anomaly detection approach and an experimental evaluation that also involves fully supervised models results are competitive and some of them appear close to the upper bound performances provided by supervised alternatives the authors also provide an exhaustive description of the architecture and hyperparameters in the appendix section i think the main merit of the proposed method is the ability to achieve satisfactory anomaly detection performance without large availability of normal samples while adapting to new tasks docsepthis paper proposes an image classification system where a set of normal patterns are stored as a dictionary and the degree of deviation is used as the anomaly score the overall architecture seems to feature a conventional deep encoder and a pattern matching module that compares the encoded latent vector against the pattern dictionary the pattern matching is done by solving the lasso regression problem the authors propose a certain online learning approach following existing fewshot learning methods and also a synthetic sample generation approach using random perturbation combined with a gradient method this paper can be viewed as a proposal of a new image anomaly detection system that combines existing methods the proposed adaptive dictionarybased approach is not new the use of a deep encoder for image analysis is of course not new the term energybased does not mean much because any probability distribution can be thought of as energybased so the question is whether the combination is innovative or not although i appreciate the authors efforts to develop an industryapplicationready image processing system i do not think the technical novelty meets the standard that iclr papers are supposed to have eq2 the sign of ln z eq3 y is not defined in the generative model in eq 1 p3 the problem setting should be clearly defined what are the input and output the anomaly score must be clearly defined in the main text as part of the problem setting good industrial usecase paper limited technical novelty less innovative combination of known methods docsepthe author proposes a fast adaptive anomaly detection method using an adaptive sparse coding layer in the fewshot setting the proposed method outperforms baselines strength the experimental result on two benchmark datasets outperforms most of the baselines an ablation study is presented to demonstrate the robustness of the proposed method however there are also some places where this paper needs further clarifications it is not clearly shown in the experiment as to how the technique in section 31 improves the robustness of the network on different types of objects a small typo in section 31 visualizations of the hard shrinkage function and sigshrink with different values of tau are presented in fig 3 rather than fig 7 the caption of table i suggests that col 25 are fully supervised methods trained with massive normal samples and are considered as the upperbounds however it seems like the first column with ae ssim is able to deliver better results in many categories than these upper bounds such as carpet grid and bottle it seems from table 2 that the 
proposed method without leveraging any temporal information can achieve even better performance in the cuhk avenue and the shtech dataset more discussions are encouraged on the benefit of incorporating such temporal information versus image frames randomly sampled from the target scenes to summarize this paper proposed a fast adaptive anomaly detection framework for anomaly detection in images the proposed method is technically sound however the novelty is limited to the adaptive sparse coding layer with receptive field while the rest of the network structure seems to be a big melting pot incorporating the energy based model and episodic training in the context of metalearning based fewshot learning docsepthe paper presents an anomaly detection algorithm that uses an ebm energy based model to distinguish between normal and anomaly this model generates pseudoanomaly instances onthefly for each normal instance and then learns to assign low energy to normal instances and high energy to pseudoanomaly instances the model is further designed to be able to adapt quickly from a few normal labeled examples to new tasks pros 1 presents a wellreasoned set of design choices 2 good ablation studies cons 1 assumes all training data is normal 2 lacks experiments with contaminated training data main comments 1 expects all training data to be normal effectively makes the algorithm fully supervised since we must manually make sure that all input train data is normal 2 should show performance of algorithm when training data is contaminated with anomalies varying contamination fraction say from 00 to 010 3 section 31 the sparsity regularization is formulated as the mean squared error mse between the original this is a nice explanation of the design choice the paper mixes existing techniques in a nice way to make the overall solution very effective ### Summary:
this work tries to tackle the problem of anomaly detection across different tasks to do so authors employ energybased models ebms and define an outlier score in terms of the ebm energies having a shared sparse code for different tasks this pipeline is tested on some image and video anomaly datasets for industrial inspection the reviewers highlighted some concerns that need to be addressed before the paper is ready for publication first a revision could benefit from a rewriting that clearly formalizes the learning problem from page 2 and then discusses the possible modeling options given (i) the task at hand and (ii) some efficiency requirements second concerning the modeling choices of the proposed pipeline the motivation behind the choice of ebms should be strengthened for example it is not clear why the proposed sparse coding could be used for any other latent variable probabilistic model as observed by one reviewer the pros of having energies instead of probabilities or just reconstructions from a deterministic autoencoder are not discussed sufficiently additionally the heuristics of running langevin dynamics for only 5 steps should be backed up by stronger empirical evidence as it lacks theory and it should be discussed how much you should run the markov chain to obtain sensible negative samples third conclusions over the experiments on the provided benchmarks seem preliminary for instance a new revision could benefit from adding a statistical significance analysis to the reported accuracies i appreciate that authors added further ablation studies including experiments on contaminated data in the latest revision i suggest they extend the experimental suite to more benchmarks including those commonly used for anomaly detection
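To make the mechanism discussed in the reviews and summary above more concrete, the following is a minimal, illustrative sketch of an energy-based anomaly scorer whose negatives come from a short Langevin run. It is not the authors' implementation; the network, loss form, and all hyperparameters are assumptions chosen for brevity.

```python
# Minimal sketch (not the paper's code) of the recipe the summary describes:
# an energy network E(x), anomaly score = energy, and pseudo-anomalies produced
# by a short Langevin run. Architecture, losses and hyperparameters are assumed.
import torch
import torch.nn as nn

class EnergyNet(nn.Module):
    def __init__(self, in_dim: int = 128, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)  # one scalar energy per sample

def langevin_negatives(model: nn.Module, x_pos: torch.Tensor, steps: int = 5,
                       step_size: float = 0.01, noise_std: float = 0.005) -> torch.Tensor:
    """Perturb normal samples, then take a few gradient steps toward low energy.
    Only 5 steps are used here to mirror the heuristic questioned in the summary."""
    x = (x_pos + 0.1 * torch.randn_like(x_pos)).detach().requires_grad_(True)
    for _ in range(steps):
        grad, = torch.autograd.grad(model(x).sum(), x)
        x = (x - step_size * grad + noise_std * torch.randn_like(x)).detach().requires_grad_(True)
    return x.detach()

def training_step(model, optimizer, x_normal):
    x_neg = langevin_negatives(model, x_normal)
    e_pos, e_neg = model(x_normal), model(x_neg)
    # push energy down on normal data and up on the synthesized pseudo-anomalies,
    # with a small penalty keeping energies bounded
    loss = e_pos.mean() - e_neg.mean() + 0.1 * (e_pos ** 2 + e_neg ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def anomaly_score(model, x):
    with torch.no_grad():
        return model(x)  # higher energy = more anomalous

if __name__ == "__main__":
    torch.manual_seed(0)
    net = EnergyNet()
    opt = torch.optim.Adam(net.parameters(), lr=1e-4)
    feats = torch.randn(32, 128)  # stand-in for encoder features of normal samples
    print(training_step(net, opt, feats))
    print(anomaly_score(net, feats[:4]))
```

The open question flagged in the summary is whether such a short chain yields negatives that are hard enough to be informative.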
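The third review above also describes a dictionary view of the same model: normal feature patterns are stored as a dictionary, a test feature is matched against it by solving a lasso regression, and the degree of deviation is the anomaly score. Below is a hedged sketch of that reading; the atom-selection step, regularization strength, and sizes are illustrative assumptions rather than the paper's procedure.

```python
# Hedged sketch of the dictionary reading above: store normal feature patterns as
# dictionary atoms, sparse-code a test feature against them with lasso, and use the
# reconstruction residual as the anomaly score. Atom selection, alpha and sizes are
# illustrative assumptions, not the paper's actual procedure.
import numpy as np
from sklearn.linear_model import Lasso

def build_dictionary(normal_feats: np.ndarray, n_atoms: int = 64, seed: int = 0) -> np.ndarray:
    """Pick atoms as a random subset of normal features (a real system might
    learn them or use cluster centroids); rows are unit-normalized atoms."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(normal_feats), size=min(n_atoms, len(normal_feats)), replace=False)
    atoms = normal_feats[idx]
    return atoms / np.linalg.norm(atoms, axis=1, keepdims=True)

def anomaly_score(feat: np.ndarray, dictionary: np.ndarray, alpha: float = 0.05) -> float:
    """Solve min_c ||D^T c - feat||^2 + alpha * ||c||_1 and score by the residual norm."""
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
    lasso.fit(dictionary.T, feat)          # X: (dim, n_atoms), y: (dim,)
    recon = dictionary.T @ lasso.coef_
    return float(np.linalg.norm(feat - recon))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    normal = rng.normal(size=(500, 32))    # stand-in for encoder features of normal data
    D = build_dictionary(normal)
    seen = rng.normal(size=32)
    odd = seen + 4.0                       # crude stand-in for an anomalous feature
    print("in-distribution score:", anomaly_score(seen, D))
    print("shifted score:       ", anomaly_score(odd, D))
```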
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: while the traditional document retrieval problem is formulated as mapping both the document and the query to the same vector space and then performing nearest neighbor search to find the closest document fore each query the paper instead proposes to formulate the task as a document identifier generation problem where the input is the query and the output is the target documents id this is not entirely novel with respect to dsi tay et al 2022 but given that dsi was available on the archive only three months before neurips deadline one might be able to consider this paper to be a concurrent work the proposed model by this paper neural corpus indexer also introduces a few techniques 1 query generation that augments the querytodoc data for training equivalent to document indexing 2 prefixaware weightadaptive decoder that gives different symbols to the same tokens in different positions of the ids 3 semantic document id so that similar documents share similar ids and 4 regularization technique to prevent overfitting the results on natural questions 320k a subset of nq are very promising the proposed model not only outperforms other generative retrieval models such as dsi and seal by a big margin but also strongly outperforms competitive baselines such as ance and bm25 strengths the results are very strong the baselines that the paper compares against such as ance and bm25 are very strong ones and hard to beat it is remarkable that a generative model can beat those baselines query generation helps a lot and this is a very useful information for people working in this field in fact as shown in table 2 among other techniques introduced by the paper qg is by far the biggest factor for the superior performance of nci weaknesses if query generation is the dominant factor then it seems to me that what the model is really doing is query indexing rather than document indexing this means the performance of the model will highly depend on generating a comprehensive range of queries which are likely to overlap with the test queries constructing the semantic hierarchy of the document ids can be considered as constructing the inverted index in the canonical nns retrieval setup then what the decoder is doing can be considered as locating the correct cluster in each level of the hierarchy then i am wondering if it is fair to say this model is really generative retrieval or it is more of using different embeddings for the nearest neighbor search in each layer of the hierarchy the paper only experiments on natural questions and only a subset of it while nq is a very representative dataset in this task the paper would have looked much better if it was tested on different datasets andor the full dataset of nq it is not clear why such results are not shown yes the paper mentions crucial limitations such as its lack of scalability compared to canonical nns retrieval system which i agree as well docsepthe paper proposed a sequencetosequence model that generates the relevant document given an input query as a retrieval method the proposed method have a huge improvement on the nq320k dataset natural questions the most improvement seems to be coming from using generated query based on the document paired with the document as augmented training data originality the proposed data augmentation technique is interesting and demonstrate good performance the other techniques are interesting and providing much 
smaller but also good gains quality the author addressed their limitations and present the work in context the experiments are well designed clarity the paper is clearly written significance very good gains on nq320k over previous best method the authors acknowledge a set of limitations of the model including scale of the index set inference speed and updating the index im most interested in the third one the model can only work on a fixed set of documents to retrieve from or there is a cost to update the model when the index set changes this is a limitation of the line of works that generating the document identifiers not specific to this work however im also curious given that the document identifier is generated using a kmeans algorithm how the model performs when there are new documents if we just assign an identifier to a new document using the same kmeans algorithm docsepthe paper focuses on a promising direction to learn retrieval via generation where the model will generate the relevant document ids for given queries it resembles the previous differentiable search index work with several minor improvements strengths they propose several techniques to improve the model such as prefixaware weightadaptive decoder query generation consistencybased regularization they show excellent improvement in recall1 and recall10 on the nq320k datasets largely outperforming previous baselines weaknesses nci is quite similar to dsi eg the encoderdecoder architecture indexing training and the hierarchical clustering algorithm to generate semantic doc ids it introduces several small improvements such as using prefixlm consistencybased regularization query generation etc the overall contribution seems limited another weakness is that many of the gains come from using the additional query generation data interestingly all other improvements seem to have very limited effect on the performance this also raises a question about how much the model relies on the quality of the query generation model and how it can adapt to new domains that the qgen model cant generate good questions for the target document na docsep this paper proposes a new framework for neural ir given query directly predict a document id the document ids are obtained by hierarchical clustering of documents beforehand this is a novel formulation of the problem and is very distinct from current twostage methods that have a highrecall sparse retrieval stage followed by a highprecision neural reranker or approximate nearest neighbor methods that encode both documents and queries as vectors strength the formulation is unique and intriguing i really like this idea despite some limitations the results are very promising and i imagine many followup work may come from this paper weakness i think the method rests on one large assumption that documents can be preclustered into some fixed set of ids all queries are being trained with the same fixed set of document ids this is a somewhat unconventional way to think about ir as one usually does not assume document cluster structure and ad hoc queries may flexibly retrieve any subset i think some experiments that try different definitions of document ids eg random would shed light on the assumption of having a strong document clustering prior the limitation section is reasonable i think the indexupdate aspect may be one of the most important nearterm challenges ### Summary:
this paper proposes a new framework for neural ir given query directly predict a document id the document ids are obtained by hierarchical clustering of documents beforehand this is a novel formulation of the problem and is very distinct from current twostage methods that have a highrecall sparse retrieval stage followed by a highprecision neural reranker or approximate nearest neighbor methods that encode both documents and queries as vectors the paper is fairly wellwritten the authors have addressed reviewers concerns with honest detailed feedback and have made their code available to facilitate experimentation the results are particularly strong compared to more traditional bm25based models and competing neural approaches like dsi i anticipate there will be much followon work and eventually a paradigm shift in neural ir
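A small sketch may help make the "semantic document id" idea discussed in these reviews concrete: documents are clustered recursively (for example with k-means) so that similar documents share id prefixes, and a seq2seq model is then trained to decode such ids from a query. This is an illustrative reconstruction, not the DSI/NCI code; the branching factor, leaf size, and toy embeddings are assumptions.

```python
# Illustrative reconstruction (not the DSI/NCI code) of hierarchical semantic ids:
# documents are clustered recursively with k-means, so similar documents share id
# prefixes; a seq2seq retriever is then trained to decode these ids from a query.
# Branching factor, leaf size and the toy embeddings are assumptions.
import numpy as np
from sklearn.cluster import KMeans

def semantic_ids(doc_vecs: np.ndarray, k: int = 4, leaf_size: int = 8, prefix=()):
    """Return {doc_index: id_tuple}; the id is the path of cluster indices plus a
    position inside the final leaf."""
    ids = {}
    if len(doc_vecs) <= leaf_size:
        for i in range(len(doc_vecs)):
            ids[i] = prefix + (i,)
        return ids
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(doc_vecs)
    for c in range(k):
        members = np.where(km.labels_ == c)[0]
        sub = semantic_ids(doc_vecs[members], k, leaf_size, prefix + (c,))
        for local_idx, sid in sub.items():
            ids[int(members[local_idx])] = sid
    return ids

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    docs = rng.normal(size=(100, 16))      # stand-in for document embeddings (e.g., BERT)
    ids = semantic_ids(docs)
    print("doc 0 id:", ids[0])
    print("doc 57 id:", ids[57])
    # training pairs for the generative retriever would then look like
    # (query text, "2-0-3-1"), including pseudo-queries generated from each document
```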
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
the paper builds on prior work on prototypical classification networks more specifically the work of li et al 2018 and additionally tries to include criteria such as orthogonality to enable applications such as fair classification an application to hierarchical networks is also described though the details are very hard to understand experiments show that the resulting models are able to achieve reasonable fairness accuracy tradeoffs the paper attempts interesting problems but is lacking on 2 major fronts 1 it is not clear what the improvement over the existing work is and if it is significant enough to merit acceptance at iclr 2 the writing needs a lot of work to bring out motivation for different choices about the first point the main contribution seems to lie in section 3.1 but most of the machinery here seems to be borrowed from the pcn work of li et al 2018 the additional contribution seems to be the concept subspace projection if i understood correctly whose motivation is not explained very well and the addition of the alignment term in eq 1 the paper does not explain what is the additional value added by these terms over pcn continuing from the previous point the paper is very hard to understand in the second paragraph of section 3.1 there is some departure from pcn where some combinations of p1 and other prototypes are taken the process is defined in a very handwavy manner and it is not clear what the formal mathematical operation performed here is how is this different from pcn and why was this step needed talking about digits and concept subspaces do we have as many concept subspaces as the number of classes if not how is this number picked moving on to the third paragraph of 3.1 the first few lines seem to be quite similar to pcn however at some point a probability distribution is mentioned in connection with traditional softmax probabilities but then yet another probability distribution is mentioned it is not clear what the second distribution does in the absence of formal equations it is very difficult to understand what each component does i would highly recommend describing each operation formally in a sequential manner and also adding a visualization like figure 1 in the pcn paper to clearly convey the idea to the reader the fourth paragraph of 3.1 mentions two differences from pcn again it is not clear what each of the differences achieves moreover figure 1 is neither described well in the main text nor in the caption leaving the reader puzzled over what is happening in the figure instead of the regular autoencoder a variational autoencoder is used but again the motivation is not clear other important details like the text above equation 1 and the usage of the kl divergence regularization term are skimmed over very quickly the details of the hierarchical classification setup in 3.4 are also glossed over quickly the same happens in 4.2 for instance what is meant by adopting the conditional probability training loss introduced by hase et al the contributions are not clear and the writing needs major work

docsep
the authors propose a novel model called concept subspace network csn for both hierarchical and fair classification the idea behind the network is to use sets of prototypes to define concept subspaces in the latent space defined by the neural network itself the relationships between the subspaces can be manipulated at training time to enforce concept relationships ie two concept subspaces are orthogonal if the concepts they represent are independent while they are parallel if the concepts they represent are hierarchically organised in this paper the ideas are quite novel and mostly well presented and the problem handled is significant i have though some questions and some minor comments that i hope will be addressed in the final version 1 the way in which concept subspaces are defined is not clear to me in the paper the authors write given a set of k prototypes in R^z the prototypes define a subspace generated by starting at the first prototype p1 and adding all linear combinations of vector differences from p1 to all other p_i, i in {2, ..., k} this is not clear to me and it would be beneficial having a clearer example than the one given in the paper in which it is not clear why we should obtain the plane xy 2 also in order to get a subspace do you need the assumption that k ≤ z 3 in equation 2 the term pcn(·) is just defined as the loss introduced for pcn where is it defined 4 the random baseline seems to achieve very high performance in tables 1 and 2 5 at page 7 the authors mention a global ordering of the nodes how was such ordering decided minor comments 1 z in the figure 1 instead of z 2 add upward and downward arrows near the metric names to improve readability 3 random and rand in tables 1 and 2 the paper can be accepted if some clarifications are made

docsep
this paper proposed a framework that the authors call the concept subspace network using prototypebased representation controlling the alignment between two subspaces for the purpose of fair or hierarchical classification strength the paper proposed a new prototypebased approach considering the relationship between two concepts classification tasks for fairness and hierarchical classification weakness the motivation of this paper is not clear to me it would be helpful to understand the motivation by giving examples of major applications where both fairness and hierarchical classification should be considered also is there any challenge when training a classifier using a regularization term regarding fairness in the existing hierarchy classifier training method it is not clear why the two subspaces should be orthogonal and parallel for a fair and hierarchical classifier respectively specifically in section 3.4 1 for fair classification what are the two subspaces is it correct that the two subspaces are for label classification and sensitive attribute classification eg male vs female respectively then does the orthogonal relationship between the prototypes for estimating the sensitive attribute and label prediction guarantee the independence of the actual sensitive attribute and label prediction 2 in hierarchical classification concepts are highly aligned and therefore parallel the difference between a toucan and a dalmatian is similar to the difference between a generic bird and dog in this example why does making two concepts parallel imply that the difference between a toucan and a dalmatian can be similar to the difference between a generic bird and dog i think that the parallelism of two concepts is not related to the different relationships among prototypes then it is not clear why two concepts should be parallel in the hierarchical classification questions how does one train a hierarchical classifier with 3 or more concepts in hierarchical classification i think there are at least 3 concepts eg dog vs bird dog species classification bird species classification in experiments how were the parameters lambda in equation 2 chosen minor feedback adding a figure describing the proposed architecture will help readers understand the framework the motivation of the paper and the intuition of the proposed approach is not clear

docsep
the present paper proposes a novel architecture for prototypebased classification to support class hierarchies and fairness in particular hierarchies are supported by training the model for multiple classification problems jointly each in its own subspace of the feature space spanned by the respective prototypes for fairness the paper proposes to make the subspace for the classification between subgroups orthogonal to all other subspaces such that any change in subgroup membership does not influence any other classification in a series of experiments the paper evaluates hierarchical classification and fairness separately as well as jointly and demonstrates equal or superior results to a stateoftheart approach from the literature the papers main strengths are i found it particularly elegant to phrase both hierarchical classification and fairness in the same language namely that of classification subspaces which are spanned by prototypes the paper connects to a wide range of concepts namely hierarchical classification interpretability and fairness such that it is of potential interest to a wide range of researchers the paper reports a wide range of experiments so wide indeed that much experimental material had to be pushed to the appendix i particularly appreciate the analysis of the hierarchies discovered by the prototype network and the comparison to the groundtruth hierarchy via edit distance the paper is clearly written i for one had no problem following along and would feel well equipped to reproduce the reported results the papers main weaknesses are the wide range of concepts discussed results in a certain lack of focus fairness privacy and causality are all mentioned but only discussed superficially for fairness this is particularly dangerous as readers may be misled to believe that the proposed notion of orthogonality is sufficient for fairness however fairness has many meanings as the paper acknowledges in the appendix and only some of them are related to the proposed notion of orthogonality therefore i would advise to revise references to fairness privacy and causality and to mention explicitly that only a narrow notion of these terms is implemented by the proposed model the related work fails to mention the historic roots of the prototype concept i understand that many recent works in prototype networks make the same mistake but i would still advise to not continue it prototypebased classification has to my knowledge been pioneered by kohonen in the late 1980s and early 1990s with his work on learning vector quantization refer to the review by nova and estevez 2014 doi 10.1007/s00521-013-1535-3 https://doi.org/10.1007/s00521-013-1535-3 and has since been extended in many directions such as metric learning schneider et al 2009 doi 10.1162/neco.2009.11-08-908 https://doi.org/10.1162/neco.2009.11-08-908 or probabilistic classification seo and obermayer 2003 doi 10.1162/089976603321891819 https://doi.org/10.1162/089976603321891819 the latter extension should be of particular interest because the classification scheme is very similar to the one proposed in this paper while the paper reports many different experiments any single one seems relatively small with few data sets and for hierarchical classification few baselines further i could not find information on the hyperparameter selection eg how many prototypes and how strong the regularization strengths lambda were overall my recommendation is to accept this paper while the paper could be more focused make its own contribution and limitations clearer and experiments could be extended i still believe that most flaws could be addressed with minor adjustments and that the core contribution of the paper is interesting enough to a wide range of scholars that publication is warranted nonetheless i would appreciate if the authors could help me to deepen my understanding of the work by responding to two questions i am not fully convinced that the projection into the plane spanned by the prototypes of a classification problem has any effect on the classification itself if i understand correctly the paper uses a softmax on the squared distances to all prototypes for classification which is entirely reasonable now let d^2 be the squared distance between a point z and a prototype p let tilde d^2 be the squared distance between the projected point tilde z and the same prototype p and let h^2 be the squared distance between z and tilde z since the distances form a right-angle triangle we obtain d^2 = tilde d^2 + h^2 this holds for any prototype in the same classification problem accordingly all projected distances within one classification problem are merely the original distance minus a constant offset this constant offset gets removed by the softmax anyway so i would assume that the softmax probabilities are the same no matter whether a point is projected or not (see the numerical sketch after this review block) why was the parity hierarchy used as ground truth for mnist garnot et al use a hierarchy based on visual similarity of the digits eg 3 and 8 wouldnt that be more natural overall my recommendation is to accept this paper while the paper could be more focused make its own contribution and limitations clearer and experiments could be extended i still believe that most flaws could be addressed with minor adjustments and that the core contribution of the paper is interesting enough to a wide range of scholars that publication is warranted

### Summary:
this paper extends prototypical classification networks to handle class hierarchies and fairness a new neural architecture is proposed and experimental results in support of it are presented unfortunately reviewers found that the paper in its current form is not sufficiently strong to be accepted at iclr the authors have made a significant attempt to clarify and improve the paper in their response however reviewers believe that the contributions and motivation can be clarified further we encourage the authors to improve their work according to the specific suggestions made by the reviewers and resubmit
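The reviews above describe prototype-based classification: a concept subspace is the affine span of its prototypes, points are classified with a softmax over (negative) squared distances to the prototypes, and the last reviewer argues that projecting a point onto the subspace cannot change the softmax probabilities because d^2 = tilde d^2 + h^2 for every prototype. The snippet below is only an illustrative check of that argument; the latent dimensionality, the number of prototypes, and the use of negative squared distances in the softmax are assumptions on my part, not details taken from the paper under review.

```python
# Illustrative check of the reviewer's projection argument (not the paper's actual model).
import numpy as np

rng = np.random.default_rng(0)
z_dim, k = 8, 3                                 # assumed sizes, chosen arbitrarily
prototypes = rng.normal(size=(k, z_dim))        # p_1 ... p_k in R^z
z = rng.normal(size=z_dim)                      # a latent point

# Concept subspace: affine span of the prototypes, i.e. p_1 plus all linear
# combinations of the difference vectors p_i - p_1, as described in the review.
directions = (prototypes[1:] - prototypes[0]).T      # shape (z_dim, k-1)
q, _ = np.linalg.qr(directions)                      # orthonormal basis of the span
z_proj = prototypes[0] + q @ q.T @ (z - prototypes[0])   # projection of z onto the subspace

def softmax_over_neg_sq_dists(x, protos):
    d2 = ((protos - x) ** 2).sum(axis=1)
    s = -d2
    e = np.exp(s - s.max())                          # numerically stable softmax
    return e / e.sum()

d2 = ((prototypes - z) ** 2).sum(axis=1)             # squared distances of z
d2_tilde = ((prototypes - z_proj) ** 2).sum(axis=1)  # squared distances of the projection
h2 = ((z - z_proj) ** 2).sum()                       # squared distance from z to its projection

print(np.allclose(d2, d2_tilde + h2))                # True: right-angle decomposition
print(np.allclose(softmax_over_neg_sq_dists(z, prototypes),
                  softmax_over_neg_sq_dists(z_proj, prototypes)))  # True: softmax unchanged
```

As the reviewer suspects, the constant offset h^2 is removed by the softmax, so under these assumptions the projection indeed leaves the class probabilities untouched.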
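Several of the reviews also discuss making the subspace used for the sensitive-attribute classifier orthogonal to the task subspace (for fairness) or keeping two concept subspaces parallel (for hierarchies). The sketch below is only a guess at how such an alignment penalty could look; it does not reproduce the paper's actual eq 1 / eq 2, and the squared-cosine measure and its target values are hypothetical.

```python
# Hypothetical alignment penalty between two concept subspaces, each given by the
# difference directions of its prototypes; NOT the paper's actual loss terms.
import numpy as np

def subspace_directions(prototypes):
    """Difference vectors p_i - p_1 that span a concept subspace."""
    return prototypes[1:] - prototypes[0]

def mean_sq_cosine(dirs_a, dirs_b):
    """Average squared cosine between two sets of directions: ~0 orthogonal, ~1 parallel."""
    a = dirs_a / np.linalg.norm(dirs_a, axis=1, keepdims=True)
    b = dirs_b / np.linalg.norm(dirs_b, axis=1, keepdims=True)
    return float(np.mean((a @ b.T) ** 2))

def alignment_penalty(task_protos, sensitive_protos, target):
    """target=0 pushes the subspaces towards orthogonality (fairness-style constraint);
    target=1 pushes them towards being parallel (hierarchy-style constraint)."""
    return (mean_sq_cosine(subspace_directions(task_protos),
                           subspace_directions(sensitive_protos)) - target) ** 2
```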
[ 3268, 310, 5393, 275, 4602, 342, 5899, 2602, 4090, 20552, 533, 840, 2568, 1529, 5912, 3268, 310, 5393, 352, 310, 417, 2590, 752, 253, 1273, 3268, 1057, 275, 253, 5928, 273, 7473, 7424, 352, 310, 1077, 2834, 281, 2096, 752, 1016, 4445, 1057, 891, 651, 4122, 5583, 12930, 1016, 4254, 19186, 275, 247, 22453, 5133, 285, 671, 6240, 247, 24426, 751, 4677, 337, 275, 253, 268, 14340, 2929, 281, 4518, 12709, 253, 2934, 281, 253, 9414, 50276, 48499, 12494, 273, 4562, 25957, 767, 3910, 432, 268, 14340, 969, 352, 310, 417, 2590, 752, 1016, 273, 253, 3910, 33526, 25761, 4677, 337, 310, 6747, 2529, 973, 275, 253, 2022, 2505, 4543, 275, 253, 11743, 6108, 253, 9414, 39340, 689, 752, 310, 9369, 275, 253, 4677, 3185, 273, 253, 3963, 6753, 36465, 247, 39762, 6753, 36465, 310, 908, 533, 969, 253, 16038, 310, 417, 2590, 643, 1774, 4278, 751, 253, 2505, 1840, 5150, 337, 285, 253, 10393, 273, 27451, 25760, 1747, 6591, 37820, 1307, 403, 43816, 1314, 689, 1077, 4541, 253, 4278, 273, 24498, 9162, 9978, 275, 5910, 403, 671, 27392, 264, 689, 4541, 253, 1072, 6569, 275, 5976, 323, 4227, 752, 310, 5486, 407, 25987, 253, 17697, 5912, 3733, 2957, 5611, 407, 288, 511, 1162, 355, 253, 9021, 403, 417, 2590, 285, 253, 4028, 3198, 2201, 789, 5474, 339, 431, 248, 4477, 12661, 247, 4460, 1566, 50276, 8890, 4473, 24822, 2990, 260, 11489, 50276, 1542, 1097, 24498, 285, 4344, 9162, 253, 2934, 3212, 253, 2990, 310, 281, 897, 5239, 273, 3861, 9117, 281, 4853, 4473, 749, 31748, 275, 253, 21624, 2317, 2931, 407, 253, 11454, 2990, 3139, 253, 7688, 875, 253, 749, 31748, 476, 320, 32494, 387, 3733, 673, 281, 7767, 4473, 7688, 26332, 767, 4473, 749, 31748, 403, 19627, 604, 253, 12342, 597, 1957, 403, 3907, 1223, 597, 403, 7529, 604, 253, 12342, 597, 1957, 403, 20258, 1037, 29070, 50276, 249, 436, 2929, 253, 5697, 403, 3240, 4460, 285, 6571, 973, 3559, 285, 253, 1895, 15726, 310, 1534, 50275, 74, 452, 2167, 690, 3533, 285, 690, 5884, 5701, 326, 891, 3524, 588, 320, 9713, 275, 253, 2457, 2715, 50276, 18, 253, 1039, 275, 534, 4473, 749, 31748, 403, 2931, 310, 417, 2590, 281, 479, 275, 253, 2929, 253, 4477, 3630, 1677, 247, 873, 273, 465, 3861, 9117, 275, 391, 91, 50276, 783, 3861, 9117, 4853, 247, 24822, 4561, 407, 4983, 387, 253, 806, 21841, 268, 18, 285, 6240, 512, 4872, 13553, 273, 4972, 3910, 432, 268, 18, 281, 512, 643, 12580, 891, 275, 374, 465, 436, 310, 417, 2590, 281, 479, 285, 352, 651, 320, 12912, 1907, 247, 30909, 1650, 685, 253, 581, 1677, 275, 253, 2929, 275, 534, 352, 310, 417, 2590, 2139, 359, 943, 4044, 253, 6415, 1269, 90, 374, 671, 275, 1340, 281, 755, 247, 24822, 513, 368, 878, 253, 9376, 326, 465, 50276, 91, 50276, 20, 275, 5150, 374, 253, 1307, 268, 14340, 3830, 310, 816, 2931, 347, 253, 2957, 5611, 323, 268, 14340, 835, 310, 352, 2931, 577, 253, 3632, 8245, 3133, 281, 5115, 1077, 1029, 3045, 275, 7180, 337, 285, 374, 50276, 22, 387, 3239, 818, 253, 4477, 3748, 247, 4156, 15824, 273, 253, 7632, 849, 369, 824, 15824, 4425, 50274, 37585, 5701, 50276, 18, 1182, 275, 253, 4677, 337, 3185, 273, 1182, 374, 50276, 1911, 19123, 285, 21169, 18159, 10151, 253, 7982, 4454, 281, 3157, 1239, 1430, 50276, 20, 3632, 285, 40819, 275, 7180, 337, 285, 374, 50276, 783, 2929, 476, 320, 7607, 604, 690, 8254, 6787, 403, 1160, 5474, 33032, 2520, 2929, 4081, 247, 7792, 326, 4477, 1925, 253, 4473, 24822, 2990, 970, 21841, 3169, 6779, 10938, 253, 12420, 875, 767, 749, 31748, 323, 253, 4096, 273, 253, 30410, 4344, 390, 20258, 9162, 50276, 45563, 50276, 783, 2929, 4081, 247, 747, 21841, 3169, 2746, 7296, 253, 2954, 875, 767, 12342, 9162, 8892, 
323, 28959, 285, 24498, 9162, 50275, 20881, 1255, 50276, 783, 16038, 273, 436, 2929, 310, 417, 2590, 281, 479, 352, 651, 320, 9371, 281, 2096, 253, 16038, 407, 4933, 6667, 273, 2201, 4893, 835, 1097, 28959, 285, 24498, 9162, 943, 320, 2783, 671, 310, 627, 667, 5691, 672, 3733, 247, 30410, 970, 247, 37820, 1307, 5001, 28959, 275, 253, 5368, 19868, 30410, 3733, 1332, 50276, 262, 310, 417, 2590, 326, 2139, 253, 767, 749, 31748, 943, 320, 19627, 285, 7529, 323, 247, 4344, 285, 24498, 30410, 2975, 5742, 275, 2593, 5910, 50276, 18, 323, 4344, 9162, 752, 403, 253, 767, 749, 31748, 310, 352, 3451, 326, 253, 767, 749, 31748, 403, 323, 5203, 9162, 285, 7996, 12474, 9162, 24088, 5086, 4632, 5343, 2975, 840, 1057, 253, 19627, 2954, 875, 253, 3861, 9117, 323, 26230, 253, 7996, 11104, 285, 5203, 10554, 12215, 253, 14275, 273, 253, 4588, 7996, 11104, 285, 5203, 10554, 374, 50276, 249, 24498, 9162, 12342, 403, 4122, 15616, 285, 3103, 7529, 253, 3064, 875, 247, 3912, 5092, 285, 247, 26932, 2056, 757, 310, 2074, 281, 253, 3064, 875, 247, 12314, 12621, 285, 4370, 50276, 249, 436, 1650, 2139, 513, 7529, 767, 12342, 16084, 253, 3064, 875, 247, 3912, 5092, 285, 247, 26932, 2056, 757, 476, 320, 2074, 281, 253, 3064, 875, 247, 12314, 12621, 285, 4370, 891, 1158, 326, 253, 7529, 1204, 273, 767, 12342, 310, 417, 2905, 281, 253, 1027, 7688, 2190, 3861, 9117, 840, 352, 310, 417, 2590, 326, 2139, 767, 12342, 943, 320, 7529, 275, 253, 24498, 9162, 50275, 34974, 50275, 5430, 1057, 581, 6194, 247, 24498, 30410, 342, 495, 390, 625, 12342, 275, 24498, 9162, 891, 1158, 627, 403, 387, 1878, 495, 12342, 24088, 4370, 4632, 12621, 4370, 3417, 9162, 12621, 3417, 9162, 50276, 249, 4679, 849, 369, 253, 4764, 24082, 34797, 275, 5150, 374, 6777, 275, 253, 4679, 50276, 37585, 8680, 50275, 8052, 247, 4677, 12930, 253, 4081, 10336, 588, 1361, 10668, 2096, 253, 7792, 50275, 783, 16038, 273, 253, 2929, 285, 253, 30328, 273, 253, 4081, 2746, 310, 417, 2590, 50276, 7152, 339, 431, 248, 1246, 2929, 29328, 247, 4460, 10336, 323, 21841, 3169, 9162, 281, 1329, 966, 20258, 447, 285, 28959, 275, 1798, 20258, 447, 403, 4516, 407, 3733, 253, 1566, 323, 2709, 9162, 3237, 26277, 1016, 275, 697, 1211, 24822, 273, 253, 4735, 2317, 40423, 407, 253, 9056, 3861, 9117, 323, 28959, 253, 2929, 29328, 281, 1056, 253, 24822, 323, 253, 9162, 875, 22105, 19627, 281, 512, 643, 749, 31748, 824, 326, 667, 1818, 275, 14632, 14199, 1057, 417, 4833, 667, 643, 9162, 275, 247, 2962, 273, 4679, 253, 2929, 44995, 24498, 9162, 285, 28959, 11794, 347, 973, 347, 26277, 285, 14371, 4503, 390, 8936, 1543, 281, 247, 1375, 23037, 14387, 2746, 432, 253, 6239, 253, 9380, 2022, 20544, 403, 50276, 74, 1119, 352, 3782, 20654, 281, 12616, 1097, 24498, 9162, 285, 28959, 275, 253, 1072, 3448, 10775, 326, 273, 9162, 749, 31748, 534, 403, 40423, 407, 3861, 9117, 50276, 783, 2929, 23417, 281, 247, 4618, 2491, 273, 12342, 10775, 24498, 9162, 4665, 1430, 285, 28959, 824, 326, 352, 310, 273, 2442, 1600, 281, 247, 4618, 2491, 273, 8607, 50276, 783, 2929, 5012, 247, 4618, 2491, 273, 4679, 594, 4618, 6296, 326, 1199, 5661, 2144, 574, 281, 320, 10184, 281, 253, 30762, 891, 3782, 11435, 253, 1783, 273, 253, 20258, 447, 6888, 407, 253, 21841, 2990, 285, 253, 5301, 281, 253, 3216, 33024, 19868, 3066, 12921, 4181, 50276, 783, 2929, 310, 4518, 3542, 891, 323, 581, 574, 642, 1895, 1563, 2112, 285, 651, 1928, 973, 13496, 281, 18302, 253, 2361, 1543, 50276, 783, 9380, 2022, 32213, 403, 50276, 783, 4618, 2491, 273, 12342, 5469, 906, 275, 247, 2176, 3480, 273, 2770, 28959, 11068, 285, 46449, 403, 512, 
5393, 533, 760, 5469, 23426, 280, 1365, 323, 28959, 436, 310, 3782, 8312, 347, 10668, 778, 320, 3731, 26460, 281, 2868, 326, 253, 4081, 10732, 273, 9373, 38931, 1319, 310, 4209, 323, 28959, 2299, 28959, 556, 1142, 30460, 347, 253, 2929, 26785, 275, 253, 30762, 285, 760, 690, 273, 731, 403, 2905, 281, 253, 4081, 10732, 273, 9373, 38931, 1319, 3103, 891, 651, 22276, 281, 49620, 10414, 281, 28959, 11068, 285, 46449, 801, 281, 3748, 11120, 326, 760, 247, 6891, 10732, 273, 841, 2426, 310, 9009, 407, 253, 4081, 1566, 50276, 783, 2905, 789, 10224, 281, 3748, 253, 14464, 11465, 273, 253, 21841, 4473, 891, 2096, 326, 1142, 3332, 2987, 275, 21841, 6928, 1056, 253, 1072, 10551, 533, 891, 651, 1335, 22276, 281, 417, 4035, 352, 21841, 3169, 9162, 556, 50276, 936, 619, 3640, 50276, 20394, 33536, 35842, 407, 465, 1368, 47591, 275, 253, 3563, 9178, 339, 1285, 7901, 84, 342, 521, 789, 327, 4715, 4972, 36643, 3730, 281, 253, 2278, 407, 642, 6156, 285, 12368, 29350, 4059, 28076, 8437, 7931, 84, 5523, 1797, 13144, 1010, 29137, 3614, 3088, 1528, 72, 6903, 7931, 84, 5523, 1797, 13144, 1010, 29137, 50276, 395, 556, 1580, 644, 6508, 275, 1142, 10746, 824, 347, 7982, 4715, 5807, 34629, 1162, 355, 4748, 28076, 6903, 19603, 570, 1940, 1518, 4739, 12347, 33648, 3614, 3088, 1528, 72, 6903, 19603, 570, 1940, 1518, 4739, 12347, 33648, 390, 37851, 9162, 396, 80, 285, 258, 589, 78, 4071, 6469, 28076, 8437, 1036, 17391, 1525, 3121, 29251, 1237, 18359, 1093, 746, 3614, 3088, 1528, 72, 6903, 1036, 17391, 1525, 3121, 29251, 1237, 18359, 1093, 746, 50276, 783, 6158, 6880, 943, 320, 273, 1798, 1600, 984, 253, 9162, 6974, 310, 1077, 2074, 281, 253, 581, 4081, 275, 436, 2929, 50276, 6050, 253, 2929, 5012, 1142, 1027, 4679, 667, 2014, 581, 3133, 4942, 1355, 342, 1643, 941, 5239, 285, 323, 24498, 9162, 1643, 1666, 25379, 2007, 891, 812, 417, 1089, 1491, 327, 253, 4373, 19484, 5438, 24088, 849, 1142, 3861, 9117, 285, 849, 2266, 253, 37820, 20544, 29331, 497, 50276, 1189, 455, 619, 17401, 310, 281, 2997, 436, 2929, 1223, 253, 2929, 812, 320, 625, 7106, 1056, 697, 1211, 7680, 285, 7364, 625, 4518, 285, 4679, 812, 320, 6508, 891, 1335, 2868, 326, 954, 32138, 812, 320, 9713, 342, 5884, 23927, 285, 326, 253, 5161, 7680, 273, 253, 2929, 310, 4722, 2217, 281, 247, 4618, 2491, 273, 12981, 326, 9311, 310, 26085, 50276, 4160, 14153, 891, 651, 11435, 604, 253, 4477, 812, 1361, 479, 281, 3676, 257, 619, 2096, 273, 253, 789, 407, 19392, 281, 767, 3533, 50275, 74, 717, 417, 4751, 13762, 326, 253, 12378, 715, 253, 6415, 40423, 407, 253, 3861, 9117, 273, 247, 9162, 1895, 556, 667, 1055, 327, 253, 9162, 3139, 604, 891, 2096, 9113, 253, 2929, 4648, 247, 2602, 4090, 327, 253, 30044, 13849, 281, 512, 3861, 9117, 323, 9162, 534, 310, 7094, 5272, 1024, 1339, 277, 19, 320, 253, 30044, 4181, 875, 247, 1127, 1182, 285, 247, 21841, 268, 1339, 277, 19, 320, 253, 30044, 4181, 875, 253, 16589, 1127, 246, 6227, 1182, 285, 253, 1072, 21841, 268, 285, 1339, 288, 19, 320, 253, 30044, 4181, 875, 1182, 285, 246, 6227, 1182, 1580, 253, 13849, 830, 247, 987, 2134, 19037, 359, 4044, 277, 19, 50276, 69, 19, 50276, 73, 19, 436, 6556, 323, 667, 21841, 275, 253, 1072, 9162, 1895, 15672, 512, 16589, 13849, 1561, 581, 9162, 1895, 403, 7960, 253, 3236, 4181, 19734, 247, 3638, 8409, 436, 3638, 8409, 4850, 5176, 407, 2602, 4090, 667, 1576, 594, 891, 651, 5467, 326, 253, 2602, 4090, 20552, 403, 253, 1072, 50276, 2369, 2647, 1880, 247, 1127, 310, 16589, 390, 417, 50276, 22309, 369, 253, 25622, 19868, 908, 347, 3216, 5083, 323, 278, 79, 382, 6746, 1439, 1162, 355, 897, 247, 19868, 
1754, 327, 5304, 14259, 273, 253, 24321, 24088, 495, 285, 854, 651, 2649, 326, 320, 625, 3626, 4583, 619, 17401, 310, 281, 2997, 436, 2929, 1223, 253, 2929, 812, 320, 625, 7106, 1056, 697, 1211, 7680, 285, 7364, 625, 4518, 285, 4679, 812, 320, 6508, 891, 1335, 2868, 326, 954, 32138, 812, 320, 9713, 342, 5884, 23927, 285, 326, 253, 5161, 7680, 273, 253, 2929, 310, 4722, 2217, 281, 247, 4618, 2491, 273, 12981, 326, 9311, 310, 26085, 2490, 187, 4118, 18435, 27, 2520, 2929, 8725, 3861, 49225, 9162, 6928, 281, 6016, 966, 20258, 447, 285, 28959, 747, 11454, 10336, 310, 4081, 285, 5661, 1543, 275, 1329, 273, 352, 403, 3559, 50276, 328, 9520, 30628, 1119, 326, 2929, 275, 697, 1655, 323, 310, 417, 10481, 2266, 281, 320, 7607, 387, 17857, 32888, 4477, 452, 1160, 247, 1534, 3177, 281, 19148, 285, 3157, 253, 2929, 275, 616, 2380, 2299, 30628, 2868, 326, 9021, 285, 16038, 476, 320, 31637, 2007, 359, 11907, 4477, 281, 3157, 616, 789, 2556, 281, 253, 2173, 13991, 1160, 407, 253, 30628, 285, 501, 538, 2225 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 3268, 310, 5393, 275, 4602, 342, 5899, 2602, 4090, 20552, 533, 840, 2568, 1529, 5912, 3268, 310, 5393, 352, 310, 417, 2590, 752, 253, 1273, 3268, 1057, 275, 253, 5928, 273, 7473, 7424, 352, 310, 1077, 2834, 281, 2096, 752, 1016, 4445, 1057, 891, 651, 4122, 5583, 12930, 1016, 4254, 19186, 275, 247, 22453, 5133, 285, 671, 6240, 247, 24426, 751, 4677, 337, 275, 253, 268, 14340, 2929, 281, 4518, 12709, 253, 2934, 281, 253, 9414, 50276, 48499, 12494, 273, 4562, 25957, 767, 3910, 432, 268, 14340, 969, 352, 310, 417, 2590, 752, 1016, 273, 253, 3910, 33526, 25761, 4677, 337, 310, 6747, 2529, 973, 275, 253, 2022, 2505, 4543, 275, 253, 11743, 6108, 253, 9414, 39340, 689, 752, 310, 9369, 275, 253, 4677, 3185, 273, 253, 3963, 6753, 36465, 247, 39762, 6753, 36465, 310, 908, 533, 969, 253, 16038, 310, 417, 2590, 643, 1774, 4278, 751, 253, 2505, 1840, 5150, 337, 285, 253, 10393, 273, 27451, 25760, 1747, 6591, 37820, 1307, 403, 43816, 1314, 689, 1077, 4541, 253, 4278, 273, 24498, 9162, 9978, 275, 5910, 403, 671, 27392, 264, 689, 4541, 253, 1072, 6569, 275, 5976, 323, 4227, 752, 310, 5486, 407, 25987, 253, 17697, 5912, 3733, 2957, 5611, 407, 288, 511, 1162, 355, 253, 9021, 403, 417, 2590, 285, 253, 4028, 3198, 2201, 789, 5474, 339, 431, 248, 4477, 12661, 247, 4460, 1566, 50276, 8890, 4473, 24822, 2990, 260, 11489, 50276, 1542, 1097, 24498, 285, 4344, 9162, 253, 2934, 3212, 253, 2990, 310, 281, 897, 5239, 273, 3861, 9117, 281, 4853, 4473, 749, 31748, 275, 253, 21624, 2317, 2931, 407, 253, 11454, 2990, 3139, 253, 7688, 875, 253, 749, 31748, 476, 320, 32494, 387, 3733, 673, 281, 7767, 4473, 7688, 26332, 767, 4473, 749, 31748, 403, 19627, 604, 253, 12342, 597, 1957, 403, 3907, 1223, 597, 403, 7529, 604, 253, 12342, 597, 1957, 403, 20258, 1037, 29070, 50276, 249, 436, 2929, 253, 5697, 403, 3240, 4460, 285, 6571, 973, 3559, 285, 253, 1895, 15726, 310, 1534, 50275, 74, 452, 2167, 690, 3533, 285, 690, 5884, 5701, 326, 891, 3524, 588, 320, 9713, 275, 253, 2457, 2715, 50276, 18, 253, 1039, 275, 534, 4473, 749, 31748, 403, 2931, 310, 417, 2590, 281, 479, 275, 253, 2929, 253, 4477, 3630, 1677, 247, 873, 273, 465, 3861, 9117, 275, 391, 91, 50276, 783, 3861, 9117, 4853, 247, 24822, 4561, 407, 4983, 387, 253, 806, 21841, 268, 18, 285, 6240, 512, 4872, 13553, 273, 4972, 3910, 432, 268, 18, 281, 512, 643, 12580, 891, 275, 374, 465, 436, 310, 417, 2590, 281, 479, 285, 352, 651, 320, 12912, 1907, 247, 30909, 1650, 685, 253, 581, 1677, 275, 253, 2929, 275, 534, 352, 310, 417, 2590, 2139, 359, 943, 4044, 253, 6415, 1269, 90, 374, 671, 275, 1340, 281, 755, 247, 24822, 513, 368, 878, 253, 9376, 326, 465, 50276, 91, 50276, 20, 275, 5150, 374, 253, 1307, 268, 14340, 3830, 310, 816, 2931, 347, 253, 2957, 5611, 323, 268, 14340, 835, 310, 352, 2931, 577, 253, 3632, 8245, 3133, 281, 5115, 1077, 1029, 3045, 275, 7180, 337, 285, 374, 50276, 22, 387, 3239, 818, 253, 4477, 3748, 247, 4156, 15824, 273, 253, 7632, 849, 369, 824, 15824, 4425, 50274, 37585, 5701, 50276, 18, 1182, 275, 253, 4677, 337, 3185, 273, 1182, 374, 50276, 1911, 19123, 285, 21169, 18159, 10151, 253, 7982, 4454, 281, 3157, 1239, 1430, 50276, 20, 3632, 285, 40819, 275, 7180, 337, 285, 374, 50276, 783, 2929, 476, 320, 7607, 604, 690, 8254, 6787, 403, 1160, 5474, 33032, 2520, 2929, 4081, 247, 7792, 326, 4477, 1925, 253, 4473, 24822, 2990, 970, 21841, 3169, 6779, 10938, 253, 12420, 875, 767, 749, 31748, 323, 253, 4096, 273, 253, 30410, 4344, 390, 20258, 9162, 50276, 45563, 50276, 783, 2929, 4081, 247, 747, 21841, 3169, 2746, 7296, 253, 2954, 875, 767, 12342, 9162, 8892, 
323, 28959, 285, 24498, 9162, 50275, 20881, 1255, 50276, 783, 16038, 273, 436, 2929, 310, 417, 2590, 281, 479, 352, 651, 320, 9371, 281, 2096, 253, 16038, 407, 4933, 6667, 273, 2201, 4893, 835, 1097, 28959, 285, 24498, 9162, 943, 320, 2783, 671, 310, 627, 667, 5691, 672, 3733, 247, 30410, 970, 247, 37820, 1307, 5001, 28959, 275, 253, 5368, 19868, 30410, 3733, 1332, 50276, 262, 310, 417, 2590, 326, 2139, 253, 767, 749, 31748, 943, 320, 19627, 285, 7529, 323, 247, 4344, 285, 24498, 30410, 2975, 5742, 275, 2593, 5910, 50276, 18, 323, 4344, 9162, 752, 403, 253, 767, 749, 31748, 310, 352, 3451, 326, 253, 767, 749, 31748, 403, 323, 5203, 9162, 285, 7996, 12474, 9162, 24088, 5086, 4632, 5343, 2975, 840, 1057, 253, 19627, 2954, 875, 253, 3861, 9117, 323, 26230, 253, 7996, 11104, 285, 5203, 10554, 12215, 253, 14275, 273, 253, 4588, 7996, 11104, 285, 5203, 10554, 374, 50276, 249, 24498, 9162, 12342, 403, 4122, 15616, 285, 3103, 7529, 253, 3064, 875, 247, 3912, 5092, 285, 247, 26932, 2056, 757, 310, 2074, 281, 253, 3064, 875, 247, 12314, 12621, 285, 4370, 50276, 249, 436, 1650, 2139, 513, 7529, 767, 12342, 16084, 253, 3064, 875, 247, 3912, 5092, 285, 247, 26932, 2056, 757, 476, 320, 2074, 281, 253, 3064, 875, 247, 12314, 12621, 285, 4370, 891, 1158, 326, 253, 7529, 1204, 273, 767, 12342, 310, 417, 2905, 281, 253, 1027, 7688, 2190, 3861, 9117, 840, 352, 310, 417, 2590, 326, 2139, 767, 12342, 943, 320, 7529, 275, 253, 24498, 9162, 50275, 34974, 50275, 5430, 1057, 581, 6194, 247, 24498, 30410, 342, 495, 390, 625, 12342, 275, 24498, 9162, 891, 1158, 627, 403, 387, 1878, 495, 12342, 24088, 4370, 4632, 12621, 4370, 3417, 9162, 12621, 3417, 9162, 50276, 249, 4679, 849, 369, 253, 4764, 24082, 34797, 275, 5150, 374, 6777, 275, 253, 4679, 50276, 37585, 8680, 50275, 8052, 247, 4677, 12930, 253, 4081, 10336, 588, 1361, 10668, 2096, 253, 7792, 50275, 783, 16038, 273, 253, 2929, 285, 253, 30328, 273, 253, 4081, 2746, 310, 417, 2590, 50276, 7152, 339, 431, 248, 1246, 2929, 29328, 247, 4460, 10336, 323, 21841, 3169, 9162, 281, 1329, 966, 20258, 447, 285, 28959, 275, 1798, 20258, 447, 403, 4516, 407, 3733, 253, 1566, 323, 2709, 9162, 3237, 26277, 1016, 275, 697, 1211, 24822, 273, 253, 4735, 2317, 40423, 407, 253, 9056, 3861, 9117, 323, 28959, 253, 2929, 29328, 281, 1056, 253, 24822, 323, 253, 9162, 875, 22105, 19627, 281, 512, 643, 749, 31748, 824, 326, 667, 1818, 275, 14632, 14199, 1057, 417, 4833, 667, 643, 9162, 275, 247, 2962, 273, 4679, 253, 2929, 44995, 24498, 9162, 285, 28959, 11794, 347, 973, 347, 26277, 285, 14371, 4503, 390, 8936, 1543, 281, 247, 1375, 23037, 14387, 2746, 432, 253, 6239, 253, 9380, 2022, 20544, 403, 50276, 74, 1119, 352, 3782, 20654, 281, 12616, 1097, 24498, 9162, 285, 28959, 275, 253, 1072, 3448, 10775, 326, 273, 9162, 749, 31748, 534, 403, 40423, 407, 3861, 9117, 50276, 783, 2929, 23417, 281, 247, 4618, 2491, 273, 12342, 10775, 24498, 9162, 4665, 1430, 285, 28959, 824, 326, 352, 310, 273, 2442, 1600, 281, 247, 4618, 2491, 273, 8607, 50276, 783, 2929, 5012, 247, 4618, 2491, 273, 4679, 594, 4618, 6296, 326, 1199, 5661, 2144, 574, 281, 320, 10184, 281, 253, 30762, 891, 3782, 11435, 253, 1783, 273, 253, 20258, 447, 6888, 407, 253, 21841, 2990, 285, 253, 5301, 281, 253, 3216, 33024, 19868, 3066, 12921, 4181, 50276, 783, 2929, 310, 4518, 3542, 891, 323, 581, 574, 642, 1895, 1563, 2112, 285, 651, 1928, 973, 13496, 281, 18302, 253, 2361, 1543, 50276, 783, 9380, 2022, 32213, 403, 50276, 783, 4618, 2491, 273, 12342, 5469, 906, 275, 247, 2176, 3480, 273, 2770, 28959, 11068, 285, 46449, 403, 512, 
5393, 533, 760, 5469, 23426, 280, 1365, 323, 28959, 436, 310, 3782, 8312, 347, 10668, 778, 320, 3731, 26460, 281, 2868, 326, 253, 4081, 10732, 273, 9373, 38931, 1319, 310, 4209, 323, 28959, 2299, 28959, 556, 1142, 30460, 347, 253, 2929, 26785, 275, 253, 30762, 285, 760, 690, 273, 731, 403, 2905, 281, 253, 4081, 10732, 273, 9373, 38931, 1319, 3103, 891, 651, 22276, 281, 49620, 10414, 281, 28959, 11068, 285, 46449, 801, 281, 3748, 11120, 326, 760, 247, 6891, 10732, 273, 841, 2426, 310, 9009, 407, 253, 4081, 1566, 50276, 783, 2905, 789, 10224, 281, 3748, 253, 14464, 11465, 273, 253, 21841, 4473, 891, 2096, 326, 1142, 3332, 2987, 275, 21841, 6928, 1056, 253, 1072, 10551, 533, 891, 651, 1335, 22276, 281, 417, 4035, 352, 21841, 3169, 9162, 556, 50276, 936, 619, 3640, 50276, 20394, 33536, 35842, 407, 465, 1368, 47591, 275, 253, 3563, 9178, 339, 1285, 7901, 84, 342, 521, 789, 327, 4715, 4972, 36643, 3730, 281, 253, 2278, 407, 642, 6156, 285, 12368, 29350, 4059, 28076, 8437, 7931, 84, 5523, 1797, 13144, 1010, 29137, 3614, 3088, 1528, 72, 6903, 7931, 84, 5523, 1797, 13144, 1010, 29137, 50276, 395, 556, 1580, 644, 6508, 275, 1142, 10746, 824, 347, 7982, 4715, 5807, 34629, 1162, 355, 4748, 28076, 6903, 19603, 570, 1940, 1518, 4739, 12347, 33648, 3614, 3088, 1528, 72, 6903, 19603, 570, 1940, 1518, 4739, 12347, 33648, 390, 37851, 9162, 396, 80, 285, 258, 589, 78, 4071, 6469, 28076, 8437, 1036, 17391, 1525, 3121, 29251, 1237, 18359, 1093, 746, 3614, 3088, 1528, 72, 6903, 1036, 17391, 1525, 3121, 29251, 1237, 18359, 1093, 746, 50276, 783, 6158, 6880, 943, 320, 273, 1798, 1600, 984, 253, 9162, 6974, 310, 1077, 2074, 281, 253, 581, 4081, 275, 436, 2929, 50276, 6050, 253, 2929, 5012, 1142, 1027, 4679, 667, 2014, 581, 3133, 4942, 1355, 342, 1643, 941, 5239, 285, 323, 24498, 9162, 1643, 1666, 25379, 2007, 891, 812, 417, 1089, 1491, 327, 253, 4373, 19484, 5438, 24088, 849, 1142, 3861, 9117, 285, 849, 2266, 253, 37820, 20544, 29331, 497, 50276, 1189, 455, 619, 17401, 310, 281, 2997, 436, 2929, 1223, 253, 2929, 812, 320, 625, 7106, 1056, 697, 1211, 7680, 285, 7364, 625, 4518, 285, 4679, 812, 320, 6508, 891, 1335, 2868, 326, 954, 32138, 812, 320, 9713, 342, 5884, 23927, 285, 326, 253, 5161, 7680, 273, 253, 2929, 310, 4722, 2217, 281, 247, 4618, 2491, 273, 12981, 326, 9311, 310, 26085, 50276, 4160, 14153, 891, 651, 11435, 604, 253, 4477, 812, 1361, 479, 281, 3676, 257, 619, 2096, 273, 253, 789, 407, 19392, 281, 767, 3533, 50275, 74, 717, 417, 4751, 13762, 326, 253, 12378, 715, 253, 6415, 40423, 407, 253, 3861, 9117, 273, 247, 9162, 1895, 556, 667, 1055, 327, 253, 9162, 3139, 604, 891, 2096, 9113, 253, 2929, 4648, 247, 2602, 4090, 327, 253, 30044, 13849, 281, 512, 3861, 9117, 323, 9162, 534, 310, 7094, 5272, 1024, 1339, 277, 19, 320, 253, 30044, 4181, 875, 247, 1127, 1182, 285, 247, 21841, 268, 1339, 277, 19, 320, 253, 30044, 4181, 875, 253, 16589, 1127, 246, 6227, 1182, 285, 253, 1072, 21841, 268, 285, 1339, 288, 19, 320, 253, 30044, 4181, 875, 1182, 285, 246, 6227, 1182, 1580, 253, 13849, 830, 247, 987, 2134, 19037, 359, 4044, 277, 19, 50276, 69, 19, 50276, 73, 19, 436, 6556, 323, 667, 21841, 275, 253, 1072, 9162, 1895, 15672, 512, 16589, 13849, 1561, 581, 9162, 1895, 403, 7960, 253, 3236, 4181, 19734, 247, 3638, 8409, 436, 3638, 8409, 4850, 5176, 407, 2602, 4090, 667, 1576, 594, 891, 651, 5467, 326, 253, 2602, 4090, 20552, 403, 253, 1072, 50276, 2369, 2647, 1880, 247, 1127, 310, 16589, 390, 417, 50276, 22309, 369, 253, 25622, 19868, 908, 347, 3216, 5083, 323, 278, 79, 382, 6746, 1439, 1162, 355, 897, 247, 19868, 
1754, 327, 5304, 14259, 273, 253, 24321, 24088, 495, 285, 854, 651, 2649, 326, 320, 625, 3626, 4583, 619, 17401, 310, 281, 2997, 436, 2929, 1223, 253, 2929, 812, 320, 625, 7106, 1056, 697, 1211, 7680, 285, 7364, 625, 4518, 285, 4679, 812, 320, 6508, 891, 1335, 2868, 326, 954, 32138, 812, 320, 9713, 342, 5884, 23927, 285, 326, 253, 5161, 7680, 273, 253, 2929, 310, 4722, 2217, 281, 247, 4618, 2491, 273, 12981, 326, 9311, 310, 26085, 2490, 187, 4118, 18435, 27, 2520, 2929, 8725, 3861, 49225, 9162, 6928, 281, 6016, 966, 20258, 447, 285, 28959, 747, 11454, 10336, 310, 4081, 285, 5661, 1543, 275, 1329, 273, 352, 403, 3559, 50276, 328, 9520, 30628, 1119, 326, 2929, 275, 697, 1655, 323, 310, 417, 10481, 2266, 281, 320, 7607, 387, 17857, 32888, 4477, 452, 1160, 247, 1534, 3177, 281, 19148, 285, 3157, 253, 2929, 275, 616, 2380, 2299, 30628, 2868, 326, 9021, 285, 16038, 476, 320, 31637, 2007, 359, 11907, 4477, 281, 3157, 616, 789, 2556, 281, 253, 2173, 13991, 1160, 407, 253, 30628, 285, 501, 538, 2225 ]
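The three numeric arrays above are the tokenised form of the review text: a sequence of token ids, an attention mask of ones for non-padded positions, and a labels array that mirrors the input ids. The sketch below shows how such a row is typically produced; the tokenizer checkpoint, the maximum length, and the choice to copy input_ids into labels are assumptions about how this dataset might have been built, not facts taken from the dump itself.

```python
# Hypothetical reconstruction of how a (input_ids, attention_mask, labels) row could be built.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")   # placeholder checkpoint; the real one is unknown

def build_row(input_text: str, max_length: int = 2048) -> dict:
    enc = tokenizer(input_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],             # token ids, like the first long array above
        "attention_mask": enc["attention_mask"],   # all ones when nothing is padded
        "labels": list(enc["input_ids"]),          # mirrors input_ids, as in the duplicated array
    }
```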
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
the authors study sgd with and without replacement and construct counterexamples for which sgd fails to converge despite the objective being otherwise wellbehaved eg convex and smooth this appears to be a significant work but is inappropriate for the neurips community because it is too theoretical the dataset z and the objective function bottom of pg 5 are simple but it is unclear how this result relates to realworld machine learning problems this is evidenced by the lack of experiments in the paper na

docsep
the authors provide a stochastic convex optimisation framework in which onepass sgd classically minimises the population risk at a O(1/sqrt(n)) rate but exhibits an Omega(1) training error and generalisation error the paper is well written each result is clearly introduced and commented the analysis is rigorous and non trivial overall i have no criticism concerning the technical results which i believe are original however the main results in section 3 seem anecdotal in the sense that 1 nobody does onepass sgd 2 the considered setting is very peculiar and clearly not encountered on a daily basis cherrypicked distribution and loss d ≥ 2n log n this on its own would not bother me as mentioned i find the results original and interesting however my main concern is the conclusions and interpretations which are drawn from the results overall i find that the introduction is very misleading here are several claims in it which i believe are misleading or false first it is clear that sgd cannot be framed as any reasonable regularized empirical risk minimization procedure for the simple reason that it does not minimize the empirical risk which challenges the implicit regularization viewpoint to the generalization of sgd the implicit regularization viewpoint is crucial in the interpolation setting a setting which is met in practice hence i cannot agree with your statement it does not challenge the viewpoint as you consider a totally different setting second any attempt to explain generalization of sgd by uniform convergence over any possibly datadependent hypotheses set cannot hold simply because the sample average associated with the very same training set sgd was trained on is not necessarily close to its respective expectation 1 your result holds only for onepass sgd which nobody does in practice hence one could expect the classical train generalisation gap to hold for multiple passes in the abstract consequently it turns out that sgd is not algorithmically stable in any sense and its generalization ability cannot be explained by uniform convergence or any other currently known generalization bound technique for that matter other than that of its classical analysis again this only holds for onepass sgd which is never considered in practice minor comments in the population risk convergence it would have been clearer to put the exact convergence bound instead of a big O so that it is clear that there is no hidden dependence on the dimension d which would have killed the rate a conclusion would have been appreciated line 183 w_s = arg min_w hat f(w) why is the argmin unique if not unique which one do you choose na

docsep
the paper shows by considering the stochastic convex optimization framework that there exist problem instances where the sgd solution exhibits both empirical risk and generalization gap of Omega(1) in the without replacement case sgd is therefore not algorithmically stable this phenomenon does not occur for sgd with replacement strengths the phenomenon reported in the paper is novel intriguing and potentially impactful in the context of sgd generalization bounds analytical evidence is solid the multiepoch regime is addressed weaknesses neither discussions nor conclusions are present limitations and potential negative societal impact of this work may be discussed more

docsep
this paper studies the empirical risk and population risk together with generalization for different variants of sgd in the context of stochastic convex optimization they also extend the analysis to other settings ie convex in expectation strongly convex their main contribution is to construct an example in which onepass sgd exhibits both empirical risk and generalization gap of Omega(1) moreover they show that this phenomenon does not exist if we use withreplacement sgd last they derive upper and lower bounds for withoutreplacement sgd in the multiepoch regime strengths this paper studies the behavior of sgd in sco providing new intuition for the analysis of test error population risk by their construction even in convex optimization there exists an instance that provably minimizes population risk while the empirical risk is of constant level resulting in a constant generalization gap this observation questions the rationale of the optimization generalization decomposition framework or erm framework the main reason is that sgd does not minimize the empirical loss which differs a lot in the smooth setting in a word this observation might lead to a new type of analysis of statistical learning which can have a broad impact on the community the other observations together with their proof techniques are also interesting for example the comparison between with and without replacement might provide some intuition on the advantage of multipass sgd in terms of its implicit regularization the paper is wellstructured and provides good intuition for the proof technique weakness the construction of the failure seems artificial and might not be general enough they rely on a special structure and to my best knowledge no realistic learning problem has a similar structure it would be great if the author can provide some connection between their constructed example and some realistic examples the analysis relies on the averaged solution instead of the last iteration solution which limits the general impact of this work it would be great if the author can at least comment on or conjecture the result for the last iteration case please see the weaknesses part

### Summary:
this paper analyzes the behavior of sgd for stochastic convex optimization showing that there exist problem instances where the sgd solution exhibits both significant empirical risk and generalization gap in the without replacement case the finding is potentially impactful in the context of sgd generalization bounds the paper is wellstructured and provides good intuition for the proof technique however i encourage the authors to provide a construction of the failure that is more general and less artificial
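The reviews above contrast one-pass SGD with and without replacement on stochastic convex objectives, where the averaged iterate can drive the population risk down at an O(1/sqrt(n)) rate while, on the adversarial instances, the empirical risk stays Omega(1). The toy simulation below only illustrates the two sampling schemes and how one might measure a train/population gap; the least-squares objective, problem sizes, and step-size schedule are placeholders and do not reproduce the paper's counterexample construction.

```python
# Toy comparison of one-pass SGD with vs. without replacement on a benign convex
# least-squares problem; NOT the paper's adversarial instance, just the sampling schemes.
import numpy as np

rng = np.random.default_rng(1)
n, d, lr0 = 200, 5, 0.1                       # placeholder sizes and step size
w_star = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_star + 0.1 * rng.normal(size=n)
X_test = rng.normal(size=(10000, d))          # fresh sample to estimate the population risk
y_test = X_test @ w_star + 0.1 * rng.normal(size=10000)

def risk(w, A, b):
    return 0.5 * np.mean((A @ w - b) ** 2)

def one_pass_sgd(with_replacement):
    idx = rng.integers(0, n, size=n) if with_replacement else rng.permutation(n)
    w = np.zeros(d)
    avg = np.zeros(d)
    for t, i in enumerate(idx, start=1):
        grad = (X[i] @ w - y[i]) * X[i]       # gradient of 0.5 * (x_i^T w - y_i)^2
        w = w - lr0 / np.sqrt(t) * grad
        avg += (w - avg) / t                  # averaged iterate, as analysed in the reviews
    return avg

for with_replacement in (True, False):
    w_hat = one_pass_sgd(with_replacement)
    gap = risk(w_hat, X_test, y_test) - risk(w_hat, X, y)   # population minus empirical risk
    label = "with replacement   " if with_replacement else "without replacement"
    print(label, "train risk:", round(risk(w_hat, X, y), 4), "gap:", round(gap, 4))
```

On this benign problem both schemes generalise; the reviewed paper's point is that a carefully constructed convex instance makes the without-replacement gap remain constant, which this sketch does not attempt to replicate.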
[input_ids: token-ID encoding of this example's review text — numeric array not reproduced]
[attention_mask: all-ones mask of the same length]
[labels: identical copy of the input_ids array]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes the use of dijkstras algorithm for ensemble construction where the pool of classifiers to choose from is the set of models trained during a hyperparameter optimization procedure the authors show that the performance of the constructed ensembles improves upon the performance of a single classifier strong points novel and interesting idea weak points the paper would benefit significantly from proofreading and copyediting the choice of the dijkstra algorithm is poorly justified and might be incorrect more baselines for comparison are required recommendation i strongly recommend that this paper be rejected while the idea of using dijkstra for post hoc ensemble construction is interesting i dont think it is justified and i am not convinced that it is a correct choice furthermore the paper is riddled with mistakes unneeded details and bad explanations this paper needs to be entirely rewritten and the experiments require proper baselines for post hoc ensemble construction extra comments there is no justification provided for using dijkstra the context is different from shortest path searching as multiple classifiers in an ensemble have complex interactions that are lost via the majority voting and empirical risk estimation in other words the weight of the edge from one node to another will vary depending on the nodes visited beforehand which is not a condition normally present for shortest path estimation i think it means that its incorrect to apply dijkstra here or at least that dijkstra offers no guarantees the same goes for the parallel with the knapsack problem ie i dont think this is an instance of the knapsack problem other baselines should be added some simple ones would be forwardbackward greedy search stacking and genetic algorithms section 331 showing with a single example that combining three models with a majority vote does not lead to optimal performance is hardly evidence that smart algorithms are needed the related works section is needlessly long for example no need to mention work on multiobjective optimization formatting of citations is incorrect see the conference author guidelines missing citations the correct citation to introduce boosting is y freund and r schapire a decisiontheoretic generalization of online learning and an application to boosting 1997 not schwenk bengio amongst others caruana et al 2004 feurer et al 2015 and levesque et al 2016 have already applied ensembling to hyperparameter optimization such references are missing from your relevant works section references caruana r niculescumizil a crew g ksikes a 2004 july ensemble selection from libraries of models in proceedings of the twentyfirst international conference on machine learning p 18 feurer m klein a eggensperger k springenberg j blum m hutter f 2015 efficient and robust automated machine learning in advances in neural information processing systems pp 29622970 levesque j c gagne c sabourin r 2016 bayesian hyperparameter optimization for ensemble learning arxiv preprint arxiv160506394 docsepthe authors contribute the idea of selecting a subset of models for building ensembles using the dijkstra algorithm furthermore they apply this algorithm on the models produced by hyperband hb and also show that it performs well on other hyperparameter tuning methods such as random search and bayesian optimization given that the latter is just an application of the algorithm the former will have to be
the main contribution of the paper what is unfortunately lacking is a comparison with other ensemble techniques including simpler ones i am not an expert but i know that simple greedy methods perform quite well see for example 1 so it is not clear the algorithm is any better than previous techniques one advantage is that the proposed technique naturally incorporates an inference time constraint but then it should be discussed without a discussion and comparison to other widely used ensemble techniques this paper is incomplete and should not be accepted the application alone to an hpo technique is a fairly straightforward one even if effective and alone does not warrant a paper with the decent but limited results presented a stronger case could be made if for example the results allowed reaching top rank in many kaggle competitions or in some existing automl competition pros the paper is well written polished and easy to understand combining hyperparameter tuning and ensembling is a good idea although already routinely done for example in kaggle competitions and in automl competitions it deserves more attention and analysis the results although limited only 2 datasets are used and ensemble baselines are missing are promising incorporating the cost constraint is an important contribution and should perhaps be discussed in more detail cons there exist many ensemble selection techniques in the literature that are simple and known to work well for example greedily selecting the next method that performs best if combined with the currently selected ensembles in some sense the proposed algorithm can be considered the obvious extension of this idea if presented with a cost constraint but it would be good to better discuss how a cost constraint changes the situation too much focus on hyperband presented as part of the main contribution i think the paper should focus more on their ensemble technique as it can be combined effectively with any method as the results show and not just that hyperband seems to work best on the two datasets selected part of this seems to stem from hyperband also working best when no ensembling is used which is not necessarily in agreement with other hpo literature the paper would be stronger if results were presented on more datasets 1 g tsoumakas i partalas i vlahavas a taxonomy and short review of ensemble selection docsepafter rebuttal no rebuttal so i will keep my score overview this paper proposes to apply ensemble methods to make use of suboptimal configurations saved from hyperband in particular the authors propose to use dijkstras algorithm to choose the best combination of size k among n models reasons for the score overall i vote for reject there are several reasons that will be detailed a bit in the next paragraphs roughly i think the paper requires a huge effort of rewriting before being published beyond that im not fully convinced by the technical novelty or difficulty of this paper pros the idea of collecting suboptimal configurations to boost the performance has not been considered in the context of hpo to the best of my knowledge the experiments in table 1 and table show that the workflow does improve the final performance cons my first major concern is on the writing there is no clear problem formulation of hpo personally im not very comfortable with the authors not presenting formally what is hyperband what is dijkstras algorithm and also how is the problem of finding the best ensemble modeled as a knapsack problem i think these details can at least be provided in the
appendices i dont think its appropriate to assume that readers are all familiar with every notion presented in the paper especially as the main purpose of this work is to propose a new workflow that combines existing techniques providing enough detail on how each component works and how they are connected in a precise way seems essential to me now regarding the experiments first of all a general question how many trials have you run for each experiment also i dont see error bars on figures 6 7 9 10 could you explain a bit more why you are interested in the experiments of table 2 i dont understand very well why it is useful to support the purpose of this work could you please be more precise on when you stop the experiments it looks like for some experiments a fixed time horizon is given i guess its the same for others but again some precise and formal descriptions are more than welcome general questions and remarks do you have any intuition on why you use averaging rather than other ensemble strategies although ensembling has not been considered in the context of hpo im not sure that the scientific contributions in this paper are significant enough could the authors highlight a bit more the technical difficulties if i have missed anything in general i feel like the paper is more like an engineering trick than a scientific discovery which could be a nice contribution of course but then in that case i think more solid experiments should be provided for example in the context of hpo we often want to see the evolution of performance over time not only the final performance minor comments and grammar issues nonexhaustive in general i would suggest the authors review a bit the writing style of the paper sometimes i feel like the authors make personal claims regarding some previous work without any support which can be citations or even some intuitions to cite one of them as an example in section 32 its pureexploration nature combined with conservative resource allocation strategies can sweep better the hyperparameter space than other strategies like blackbox bayesian optimization maybe but how is it compared to evolutionary algorithms for example my point is that we should be careful about this kind of claim the citation style is weird authors can refer to section 41 of the template file of iclr in the abstract capable of doing sth instead of capable to do sth section 2 paragraph multiobjective goal i dont really understand the sentence literature lets us imagine that the hyperparameter function topology has two plateaus section 32 bayesian bayesian section 333 hyperband hyperband section 41 it shows very effective to detect this is not correct grammatically section 43 different combinations algorithms different combinations of algorithms section 43 spmcts do not falls into spmcts does not fall into docsepthis paper is proposing to build ensembles of deep models components of which have different hyperparameter hp configurations this is done by first running hyperband to create a large pool and then running a greedy algorithm to construct an ensemble this algorithm is termed dykstras algorithm on a certain graph but it is of course simply just the default greedy algorithm which is almost by default used to create an ensemble from a pool the correct reference for this is 1 and this is just what people do when they create ensembles the paper also misses a number of relevant recent works to build ensembles of deep models at least 2 3 there is nothing new here except maybe that caruanas algorithm can now also
be called dykstras in the very unlikely case i missed something and what they call dykstras method here not detailed in the paper is different from the obvious greedy method of caruana then the paper fails by not comparing against this obvious baseline 1 caruana niculescumizil crew ksikes ensemble selection from libraries of models icml 04 proceedings of the twentyfirst international conference on machine learning 2 deep ensembles b lakshminarayanan a pritzel and c blundell simple and scalable predictive uncertainty estimation using deep ensembles in advances in neural information processing systems nips pages 64026413 2017 3 batch ensembles y wen d tran and j ba batch ensemble an alternative approach to efficient ensemble and lifelong learning arxiv preprint arxiv200206715 2020 ### Summary:
following a strong consensus across the reviewers the paper is recommended for rejection they have all acknowledged some weaknesses of the paper for instance inadequate reference to prior work unsatisfactory level of polishing too limited evaluation with more comparisons to baselines required the proposed approach dijkstra algorithm is not sufficiently justified and motivated clarity issues such as missing definitions of key components this list together with the detailed comments of the reviewers highlights opportunities to improve the manuscript for a future resubmission
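Several of these reviews point to greedy forward selection from a model library (Caruana et al., 2004) as the obvious baseline for building an ensemble out of the models produced during hyperparameter search. A minimal sketch of that baseline is given below for reference; it assumes candidate models are represented by their validation-set class-probability predictions, and the function and variable names are illustrative rather than taken from the reviewed paper.

```python
import numpy as np

def greedy_ensemble_selection(val_probs, val_labels, max_size=10, with_replacement=True):
    """Caruana-style forward selection of an ensemble from a model library.

    val_probs  : list of arrays, one per candidate model (e.g. every
                 configuration tried during hyperparameter search), each of
                 shape (n_samples, n_classes) with validation probabilities.
    val_labels : integer class labels of shape (n_samples,).
    Returns the indices of the selected models; with replacement, repeated
    indices act as implicit weights when the ensemble is averaged.
    """
    selected = []
    prob_sum = np.zeros_like(val_probs[0])

    def ensemble_error(summed, count):
        preds = np.argmax(summed / count, axis=1)
        return np.mean(preds != val_labels)

    for _ in range(max_size):
        best_idx, best_err = None, np.inf
        for idx, probs in enumerate(val_probs):
            if not with_replacement and idx in selected:
                continue
            err = ensemble_error(prob_sum + probs, len(selected) + 1)
            if err < best_err:
                best_idx, best_err = idx, err
        selected.append(best_idx)
        prob_sum += val_probs[best_idx]
    return selected
```

A cost-constrained variant, closer to what the reviewers ask the authors to compare against, would simply skip any candidate whose added inference cost exceeds the remaining budget before evaluating it.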
[input_ids: token-ID encoding of the review text above — numeric array not reproduced]
[attention_mask: all-ones mask of the same length]
[labels: identical copy of the input_ids array]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the proposed paper surveys recent work in evidential deep learning and provides some potential directions for future work in this space evidential deep learning is a set of techniques for calibrating the uncertainty of neural networks through secondorder distributions strengths in general the paper is well written structured and easy to understand there is significant value in surveying evidential deep learning techniques as there has been some parallel work in this space that is not often connected this survey covers and categorizes a large number of papers under the umbrella of evidential deep learning fig 1 is an excellent visualization and categorization of approaches in this space i liked the classification of the literature into prior and posterior methods throughout the survey weaknesses a significant portion of the survey repeats ideas from the prior network paper malinin and gales 2018 with some typos eg in eqs 3 and 4 the condition on the dataset as per the prior network paper has been omitted i am missing further insight and takeaways beyond the existing collection of papers surveyed the discussion details some potential avenues for future work and there are some insights in the footnotes it would be good to bring these footnote insights into the paper and expound on them particularly since there is enough space however as it stands the survey does not provide sufficient additional insight or context for the ideas presented to form its contribution the representation gap nandy et al 2020 and its resulting visualization were not described in sufficient detail overall it appears that there was insufficient material for a 9 page paper as equations were extremely spaced out eg eqs 3 and 4 and even then the paper did not reach the page limit making it look unpolished in general the presentation of the paper needs improvement there were frequent typos eg we also to provide e 4 uncertaintaware and inconsistent formatting for referencing figures sections and equations ie capitalization and abbreviation differences there were some notational issues as well for example it was confusing to see the definition mu_k = p(y = k | x, theta) with and without theta and then to have the distribution p(y | mu) furthermore the references list is unpolished repeating conference names listing urls etc suggestions for improvement my main recommendation is to work into the survey further insights and context from beyond the surveyed papers for example the origin of the term evidential deep learning comes from evidential theory 1 which relates to the dirichlet distribution through the ideas in subjective logic 2 it would be good to further highlight that prior networks collapse the uncertainty over the model parameters as this insight is important and worth spending more space on in the paper the sensoy et al aaai paper 3 is missing from the survey and should be incorporated additional investigation into the advantages and disadvantages of the different metrics used to evaluate the performance of evidential deep learning methods with respect to ood inputs and uncertainty calibration would be valuable since this field does not have an unreasonable number of papers the authors should consider running some of their own experiments to provide empirical conclusions comparing methods that were not already compared in existing work eg charpentier et al 2020 and sensoy et al 2020 lastly the paper should be
polished and proofread i recommend the use of the cleveref package in latex with the capitalise option to avoid inconsistencies in referencing figures equations and sections 1 a p dempster a generalization of bayesian inference classic works of the dempstershafer theory of belief functions pages 73104 2008 2 a jsang subjective logic a formalism for reasoning under uncertainty springer 2016 3 m sensoy et al uncertaintyaware deep classifiers using generative models in aaai 2020 the idea of a survey on evidential deep learning is excellent and valuable to researchers however in its current state i do not believe the paper is ready for publication at iclr with improvements to the broader insights and takeaways from the survey as well as paper presentation this work will become a good contribution to the academic community docsepthis paper surveys a collection of existing works that the author frames as evidential deep learning for the classification case evidential deep learning tries to train a network to output the parameters of a dirichlet distribution hoping that one could directly obtain the data uncertainty and model uncertainty from some functions of the output parameters the authors provide a relatively comprehensive review of recent work in this direction besides classification evidential deep learning for regression is also discussed briefly why is sensoy et al 2018 called prior networks it seems the original papers did not refer to their method as prior networks either according to the equation in section 331 shouldnt the prior be the flat dirichlet what needs to be trained is more like an approximate posterior network that takes as input a new data point x it would be helpful for readers to elaborate on this in equation 5 should mu_0, ..., mu_K be mu_1, ..., mu_K otherwise mu_0 will always be the largest by definition which makes no sense for equation 6 is some term related to b(alpha) missing the argument that the model uncertainty can simply be replaced by some function of alpha is not very convincing given that the network will directly output alpha it would be helpful if more details and intuition could be provided as a survey this manuscript seems to miss a lot of important related work on probabilistic neural networks and bayesian deep learning af minor p6 fig 2 fig 2 a naturalparameter networks a class of probabilistic neural networks nips 2016 b feedforward propagation in probabilistic neural networks with categorical and max layers iclr 2018 c samplingfree epistemic uncertainty estimation using approximated variance propagation cvpr 2019 d probabilistic backpropagation for scalable learning of bayesian neural networks icml 2015 e being bayesian even just a bit fixes overconfidence in relu networks icml 2020 f a survey on bayesian deep learning acm computing surveys 2020 overall the paper is informative and did a relatively good job in introducing the basic concepts as well as the motivation of evidential deep learning however it seems there are still some important related works that are missing from the references besides the paper in its current form looks more like a review than a comprehensive thorough survey partly because of the space constraints in iclr therefore i am not sure that iclr is the right venue for it perhaps it is more suitable as a long journal paper where more details and a taxonomy could be included docsepthe paper is a survey of methods in evidential deep learning it gives a brief motivation for this set of methods explains a general framework and describes previous works for
The paper is a survey of methods in evidential deep learning. It gives a brief motivation for this set of methods, explains a general framework, and describes previous works for classification and regression tasks in varying amounts of detail. As a survey paper it doesn't introduce novel ideas, but it gives a useful overview for people who would like to start working on this topic.

In my opinion the paper did a very good job of collecting papers on evidential deep learning and providing a brief overview of their ideas. As a person unfamiliar with this topic, I got interested and proceeded to read some of the references, as I didn't fully grasp the motivation and all the details from a rather short introduction. The paper could have been more enjoyable if it wasn't limited to 9 pages and was written in the form of a tutorial with more intuition for all the equations; for example, it took me some time to interpret Eq. 4/6/7. What bothers me about evidential deep learning is that the choice of the regularization term often seems arbitrary; I wish this paper had some deeper insights on this matter.

I haven't seen survey papers published at conferences before, so I'm hesitant to recommend its acceptance. It would have been different if the paper, in addition to being a survey, presented some fundamental insights into the nature of these methods, which could count as a novel contribution. I think it can become a well-cited journal or arXiv paper that gets people interested in the topic of evidential deep learning, if the background sections are turned into a self-contained tutorial and the survey parts include some additional motivation and intuition regarding the design choices that each of the papers makes.

Extra remarks: should Eq. 2 be p(mu | D, alpha) instead of p(alpha | D, mu)? Section 3.2: typo "quanitified". Section 3.3.2: typo "uncertaintaware".

The contributions of this paper are not novel, which is normal for a survey paper. I think survey papers are not suitable for conferences, as they need to be evaluated using a different set of criteria; therefore my recommendation would be to reject this paper purely based on its type.
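The "arbitrary regularization term" remark refers to the KL penalty that most classification-style evidential losses attach to the data term. As one concrete and commonly used instance — shown here only for context, following Sensoy et al. (2018), not necessarily the exact form in the surveyed paper — the per-sample loss is:

```latex
% Evidential loss with the commonly used KL regularizer (Sensoy et al., 2018).
% \tilde{\alpha}_i removes the evidence of the true class before regularizing.
\mathcal{L}_i(\theta) \;=\;
\underbrace{\sum_{k=1}^{K}\Big[(y_{ik}-\hat p_{ik})^{2}
 \;+\;\frac{\hat p_{ik}\,(1-\hat p_{ik})}{S_i+1}\Big]}_{\text{expected squared error under }\mathrm{Dir}(\alpha_i)}
\;+\;
\lambda_t\,\mathrm{KL}\!\Big(\mathrm{Dir}\big(\mathbf{p}\mid\tilde{\alpha}_i\big)\,\Big\|\,\mathrm{Dir}\big(\mathbf{p}\mid\mathbf{1}\big)\Big),
\qquad
\tilde{\alpha}_i = \mathbf{y}_i + (1-\mathbf{y}_i)\odot\alpha_i,\quad
\hat p_{ik} = \alpha_{ik}/S_i .
```

The annealing coefficient lambda_t and the choice of the uniform Dirichlet as the reference distribution are exactly the kinds of design decisions the reviewer would like a survey to motivate.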
The paper presents a survey of evidential deep learning, a family of machine learning methods that suggest accounting for various forms of uncertainty via a hierarchical predictive model whose hyperprior parameters are a function of the input observations. The paper classifies evidential deep learning approaches into categories and points out the strengths and weaknesses of each category. It also presents some theoretical properties of the Dirichlet distribution that could be relevant for machine learning tasks.

This is a survey paper without any conceptual claims of novelty; its main purpose is to make an emerging family of machine learning models more accessible to the community. I find this goal sensible and important. The paper presents the technical material very clearly and provides a well-justified dichotomy of the evidential deep learning model family.

I think the statement in footnote 1 creates a bit of unnecessary confusion: the precision score of the Dirichlet distribution corresponds simply to the precision of a Gaussian distribution, which is customarily defined as the inverse of the covariance. There exist a few pieces of work that use Dirichlet priors on class probabilities which are missed by this survey paper, for instance: J. Gast and S. Roth, Lightweight probabilistic deep networks, CVPR 2018; M. Haussmann et al., Bayesian evidential deep learning with PAC regularization, AABI 2020. The paper could better clarify the qualitative difference between distributional uncertainty and the representation gap. It looks to me like they both quantify the same source of uncertainty, namely whether a test-time sample is coming from a different distribution than the training-time samples; they only differ in the quantitative aspect, i.e. how much the model is confident that a sample is out-of-domain. I am missing the rationale behind introducing a new category of uncertainty only for the sake of this quantitative difference.

The paper claims to collect some theoretical properties of the Dirichlet distribution that could be useful for machine learning. However, going through the paper and the appendix, I am not able to identify theoretical content that provides any new insights on the use of the Dirichlet distribution for machine learning. The presented calculations seem to have been taken from the original papers in their vanilla form, and some additional properties, such as the expected l-infinity norm and the moment generating function, are rather textbook material.

Having said that, the paper touches on the important topic of making a new model family more accessible to the audience. I am afraid I do not think it does so in a way that would add sufficient value on top of reading the material from the original papers. I think this way because the paper adopts most of the content verbatim from the original sources and does not add a level of abstraction that helps the reader draw new insights that are not straightforward from the original sources; nor does it highlight overlooked positive or negative properties of the evidential model; nor does it present any empirical outcomes that one would find surprising. Under these conditions I am not able to recommend an accept for this paper in its current shape.

### Summary:
This paper surveys a collection of existing works that the author frames as evidential deep learning. While the paper has been recognized as a nicely written survey, all reviewers have raised the major concern that the paper does not have a sufficient academic contribution compared to the surveyed papers; in particular, novelty appears to be limited, as the paper does not offer novel views into the surveyed subfield. Given the strong consensus among reviewers, I recommend rejecting this paper.
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
Sobolev training of neural networks, which augments the standard loss function with terms that penalize discrepancies between the derivatives of the network and target functions, has been shown empirically to improve data efficiency. Intuitively, one would expect that it also aids generalization in settings where the target function is sufficiently smooth. This manuscript proposes augmenting the loss functions used to represent the solutions of partial differential equations with terms penalizing the Sobolev norm of the solution, its initial condition, and the boundary condition. The motivation for this approach is clear, because data efficiency is of the utmost importance in PDE learning problems, where the data could be very difficult to access. The experiments in this paper clearly show that a target accuracy can be achieved with fewer overall training points when using Sobolev training, very much consistent with the established understanding of the effect of penalizing Sobolev norms in typical supervised machine learning problems. Some of the examples are high-dimensional and nontrivial. The theoretical results are not particularly compelling, but they serve as a reasonable justification for the proposed scheme.
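To make the idea concrete for readers outside the physics-informed-network literature, here is a minimal sketch of what a Sobolev-type (H1) penalty added to a PDE-residual loss can look like. It is a generic illustration, not the paper's implementation: the 1D Poisson residual, the network, and the weight `lam` are all placeholder choices.

```python
import torch

def sobolev_pinn_loss(u, x, f, lam=1.0):
    """L2 residual of the 1D problem -u'' = f, plus an H1-type penalty on the
    derivative of that residual. Setting lam = 0 recovers the plain L2 loss."""
    x = x.clone().requires_grad_(True)
    ux = torch.autograd.grad(u(x).sum(), x, create_graph=True)[0]    # u'(x)
    uxx = torch.autograd.grad(ux.sum(), x, create_graph=True)[0]     # u''(x)
    r = -uxx - f(x)                                                  # pointwise PDE residual
    rx = torch.autograd.grad(r.sum(), x, create_graph=True)[0]       # d(residual)/dx
    return (r ** 2).mean() + lam * (rx ** 2).mean()

# Usage sketch with an arbitrary small network and forcing term f(x) = 1:
net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
x = torch.rand(128, 1)
loss = sobolev_pinn_loss(net, x, lambda x: torch.ones_like(x))
loss.backward()
```

The extra `autograd.grad` calls are also where the computational-cost and smoothness concerns raised by the other reviewers below enter the picture.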
The idea of using neural networks to approximate the solutions of PDEs is very interesting, especially in the high-dimensional setting where classical approaches fail to scale. Although there have been many efforts in this direction, there are still open avenues to explore. One of the most important aspects is the choice of the loss function that guides the training of the neural network, and the paper's aim is to address this issue by proposing the Sobolev norm as the loss function instead of the commonly used L2 norm. The Sobolev norm includes an additional term involving derivatives of the error. The main claim is that, with the inclusion of the additional term, the convergence of the neural network training becomes faster. This is the basic promise of the paper.

Although it is an interesting proposal, I think the paper does not address all aspects of it. In particular:
1. Training with derivatives in the loss function should be costly compared to the L2 loss. This becomes worse as the dimension increases: if the dimension is d, the first derivative scales with d and the second derivative scales with d^2. It seems that at most one can try only the first derivative.
2. No comparison is provided in terms of computational time.
3. Comparing the convergence speed may not be fair, because for the L2 loss one might be able to use a larger learning rate, which would yield faster convergence, while a large learning rate might make Sobolev training unstable.
4. Including the derivative requires strong smoothness assumptions about the PDE data (the forcing term, the BC, the IC), which are not standard in the PDE literature.

Given this, I think the contributions of the paper are incremental. Moreover, the theoretical results do not really support the claim that the Sobolev loss function is better; it would be interesting to have a negative result about the L2 loss function that motivates the application of the Sobolev norm. Moreover, it seems that the appendix is not written with care. Subsection A.1 does not really include a proof; if the result is already known, it seems better to cite the reference with an exact pointer to the result in the main body of the paper and not include it as a contribution. Subsection A.2 also includes the result Prop. A.3 without proof or reference. The proof of Thm. A.4 is not written with care: the bounds in (A.19) and (A.20) are obtained without explaining the steps, and what type of Poincaré inequality is used? It would be good to include a definition of the norms and of the Poincaré inequality for the reader unfamiliar with PDE analysis.
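For reference, the objects the reviewer asks the authors to define are standard; one common set of definitions, stated here for completeness rather than taken from the paper, is:

```latex
% Sobolev (H^k) norms on a bounded domain \Omega, and the Poincare inequality
% for functions vanishing on the boundary (u \in H^1_0(\Omega)).
\|u\|_{H^1(\Omega)}^2 \;=\; \|u\|_{L^2(\Omega)}^2 + \|\nabla u\|_{L^2(\Omega)}^2,
\qquad
\|u\|_{H^k(\Omega)}^2 \;=\; \sum_{|\beta|\le k} \|D^{\beta}u\|_{L^2(\Omega)}^2,
\qquad
\|u\|_{L^2(\Omega)} \;\le\; C_{\Omega}\,\|\nabla u\|_{L^2(\Omega)}.
```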
Overview: the paper proposes a novel loss function using Sobolev norms to decrease the computational cost when solving PDEs using neural networks. I think the idea of the work is very interesting, relevant, and in a useful direction. However, I think the paper in its current form is not yet suitable for publication, and it should be strengthened by incorporating more theoretical and numerical aspects; this will make the concept a lot more convincing. I present some ideas and comments for improvement below.

Comments, ideas for improvement, and clarification questions:
- You state that it requires relatively high computational cost compared to traditional mesh-based schemes in general; would you have a reference for this? In part I agree with the notion that neural network training could be computationally heavy, but on the other hand, as you also mention, it should not suffer from the curse of dimensionality compared to mesh-based methods, which would suggest that it is computationally efficient.
- The main claim of this work is to introduce Sobolev training to speed up convergence, or, as you mention in the introduction, to overcome the issue of high computational cost when solving PDEs using neural networks. Theoretically, the results in Section 4 do not show this. I know that in the original Sobolev training paper there is a result on how Sobolev training has a lower sample complexity than regular training; extending such a result to this setting would be necessary to make the claims in the introduction rigorous.
- The results in Thms. 4.1 and 4.2 are only for 1D equations. I understand that higher orders could be more complex, and perhaps the 1D equations are sufficient to convey the intuition; in that case, however, at least a comment is needed on how these results could be extended to higher orders.
- In Figure 1, what is the reason for the H2 loss not speeding up convergence with the ReLU? Is it the differentiability?
- The results in 5.2 are again only for 1D equations. I think that if you do not prove the results for high-dimensional PDEs theoretically, the value of the proposed methodology for high-dimensional PDEs should at least be shown in extensive numerical experiments. I do think the example in 5.4 is in the right direction, but a more rigorous analysis would be needed.
- I would like to have more information on how the true PDE values of the gradients of the boundary and interior differential operators are computed, and whether this is always possible.
- My last comment is a general one, but given that the research in this area is growing rapidly, with various approaches to improve convergence, a stronger literature review would be necessary. I give two examples of papers which could be of interest below.

Some references which may be of interest:
- Ito, Kazufumi, Christoph Reisinger, and Yufei Zhang. A neural network-based policy iteration algorithm with global H2-superlinear convergence for stochastic games on domains. Specifically, Remark 4.2 could be of interest; the authors also discuss how certain norms cannot guarantee convergence of the derivatives of the numerical solutions.
- Van der Meer, Remco, Cornelis Oosterlee, and Anastasia Borovykh. Optimally weighted loss functions for solving PDEs with neural networks. The authors discuss the choice of loss functions to also speed up and improve convergence and the solution accuracy.

### Summary:
The paper solves a PDE using an additional penalty on the derivatives of the function. On toy examples and two PDEs it is shown that these additional terms help.

Pros:
- The motivation to include derivatives in the computation.
- Implementation and testing on several examples, including high-dimensional ones.
- Timing is included in the latest version.

Cons: the loss is the Sobolev norm of the residuals of the equation. The usage of the norm of the residual is not 100% consistent with the smoothness properties of the corresponding equation. For example, for the Poisson equation the problem is selected in such a way that the solution is analytic; however, if zero boundary conditions are enforced and the right-hand side is all ones, the solution will have singularities. Thus the main challenge would be the case when the solution does have singularities, and it will have them in many practical cases; the L2 norm is then not the right functional for the solution to exist, not to mention the higher-order derivatives. So these functionals are not motivated by the theory of the solution of PDEs, but are rather focused on much smoother solutions.

Convergence: there are quite a few papers on the convergence of DNN approximations to solutions of PDEs, and the presented methods might have converged to a local minimum. An important reference is the paper by Yarotsky, D., "Error bounds for approximations with deep ReLU networks," Neural Networks, 94:103-114, 2017.
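To make the objection concrete: written out schematically (in notation chosen here for illustration; the paper's exact weighting and terms may differ), the kind of functional being minimized is

```latex
% Schematic H^k-residual loss for a boundary-value problem N[u] = f in \Omega, u = g on \partial\Omega.
\mathcal{L}(u_\theta) \;=\; \big\| N[u_\theta] - f \big\|_{H^k(\Omega)}^2
\;+\; \lambda\,\big\| u_\theta - g \big\|_{H^k(\partial\Omega)}^2 ,
```

which presumes that the residual has k square-integrable derivatives — exactly the smoothness that fails near the corner and data-induced singularities mentioned above.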
50275, 6148, 1784, 42964, 867, 1940, 944, 8101, 261, 258, 7337, 14906, 285, 271, 505, 13411, 20495, 35068, 17616, 5556, 595, 17375, 2957, 3470, 323, 16161, 268, 3229, 342, 11454, 6928, 253, 4477, 2319, 253, 4327, 273, 2957, 3470, 281, 671, 3885, 598, 50276, 49831, 14940, 285, 253, 2900, 7200, 2490, 187, 4118, 18435, 27, 783, 2929, 35910, 247, 268, 615, 970, 271, 3081, 12339, 1159, 875, 253, 13335, 273, 253, 1159, 327, 20953, 6667, 285, 767, 268, 3229, 352, 310, 2011, 326, 841, 3081, 2426, 1361, 50276, 856, 84, 50276, 783, 16038, 310, 281, 2486, 13335, 275, 253, 13782, 66, 50266, 39595, 285, 5175, 327, 2067, 6667, 1690, 1029, 6967, 4394, 50266, 12292, 272, 310, 2908, 275, 253, 6323, 2715, 50275, 5040, 253, 2957, 310, 30323, 40610, 5222, 273, 253, 42435, 273, 253, 5150, 50264, 783, 10393, 273, 253, 5222, 273, 253, 12541, 310, 417, 2233, 5185, 342, 253, 6032, 1255, 3607, 273, 253, 3969, 5150, 323, 1650, 323, 253, 2963, 17469, 5150, 253, 1895, 310, 4236, 275, 824, 247, 1039, 253, 2900, 310, 20059, 2299, 323, 1650, 604, 253, 5058, 7548, 2515, 403, 27810, 285, 987, 1133, 1930, 310, 512, 4394, 253, 2900, 588, 452, 34001, 3021, 253, 2022, 5691, 651, 320, 253, 1083, 672, 2900, 1057, 452, 253, 34001, 285, 352, 588, 452, 352, 275, 1142, 8542, 2219, 253, 298, 19, 12850, 840, 310, 417, 253, 987, 5164, 323, 253, 2900, 281, 2226, 417, 281, 1333, 670, 253, 2169, 2621, 13335, 594, 841, 1159, 932, 403, 417, 17194, 407, 253, 3762, 273, 253, 2900, 273, 268, 3229, 533, 403, 2581, 7106, 327, 1199, 39797, 977, 2900, 50272, 585, 41801, 627, 403, 3240, 247, 1643, 9380, 327, 253, 14940, 273, 277, 9866, 34754, 281, 2900, 273, 268, 3229, 253, 3559, 3082, 1537, 452, 5975, 2400, 281, 247, 1980, 5927, 271, 1774, 3806, 310, 253, 2929, 407, 340, 274, 1502, 4742, 277, 2228, 14493, 323, 34754, 342, 3676, 774, 86, 6928, 11454, 6928, 4240, 17109, 26771, 12172, 1047, 50275 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper studies the lottery ticket hypothesis which says that there is an underlying subnetwork lottery ticket in a neural network such that if we train it we will obtain a better test accuracy compared to the original network the authors develop a theoretical validation of the improved generalization error of the lottery ticket similar to many theoretical results for neural networks several assumptions have been made for example it is assumed that the underlying function which we try to learn can be entirely captured by a sparse neural network this assumption indicates that for some underlying subgraph we can learn with zero generalization error they provide empirical validation for their results as well the authors consider a significant problem pruning techniques allow us to enjoy high accuracy while reducing neural networks memory and computational running time these techniques have been studied for a long time but mostly from an empirical perspective i believe understanding the pruning problems theoretical aspects will help us develop better algorithms for pruning as well the results are clearly explained and the paper is wellwritten comments in empirical cases the pruned networks initialization is often set to the weights obtained in the training of the more extensive network is this technique used in your comparisons in the numerical evaluation it is assumed that the covariates xis are coming from a gaussian distribution implying that the label yi is a mixture of truncated gaussians before adding noise while this assumption has been made in previous works the role of this assumption is not wellexplained i am not sure whether the underlying assumption trivialized the problem it is assumed that the function f which we try to learn can be captured by a sparse onehiddenlayer neural network this sparse subnetwork would essentially be the lotteryticket of the fully connected network then it is claimed that if we know the structure of this sparse subnetwork ie known mask matrix m then it means we are learning in a lower dimension of parameters compared to the case of a complete network thus it is not surprising to see that the sgd algorithm converges faster in other words by knowing m we already set a large portion of parameters to their optimal values thus it is easier to find the optimal solution to the minimization problem the ultimate goal of the pruning problem is to get the substructure and train it efficiently does your result affect how one could potentially improve the current pruning techniques such as imp or can we say anything about the algorithms performance when a large fraction of the subnetwork is picked correctly docsepsummary of review this paper provides recovery guarantees for learning onehiddenlayer neural networks with sparse ground truth weights given an isotropic gaussian input distribution the main result shows local convexity guarantees near the ground truth provided that a mask of the sparsity pattern is already known this paper extends the tensor initialization approach of zhong et al17 to show a convergence guarantee for learning the sparse neural network simulations validate the local setting this paper focuses on learning a onehiddenlayer neural network where the weight matrix of the hidden layer is sparse given input samples from an isotropic gaussian distribution results i the first result is that within a small vicinity of the ground truth 
weight matrix the standard mean squared loss for learning the neural network is convex ii the second result shows how to learn the ground truth weight matrix by extending the tensor initialization approach in zhong et al17 iii numerical results are provided to validate the above two theoretical results pros the sample size requirement of both result i and ii growly proportionally to the sparsity of the ground truth matrix as opposed to the size of the matrix this result is particularly interesting in light of recent empirical results about network pruning and learning sparse convnets neyshabur20 cons the authors prove the above results by adapting the proof of zhong et al17 in fact since the input distribution is isotropic standard concentration results apply whether or not the ground truth matrix is sparse therefore it is unclear to the reviewer whether this result is as novel as the authors claim in the introduction the learning algorithm assumes knowledge of the sparsity mask this seems like a strong assumption isnt the point of imp to find this sparsity mask understanding how to find this sparsity mask seems like a more important question but this is not discussed at all in this paper writing overall this paper is easy to follow the quality of writing is marginally acceptable please find several detailed comments below p1 the theoretical justification of winning tickets are remains elusive expect for a few recent works remove are replace expect with except p4 an onehiddenlayer neural network a onehiddenlayer neural network p5 here you say that varepsilon1 thetasqrt r but r 1 and varepsilon1 1 please clarify p5 regarding the convergence for the vanilla gd algorithm please add a reference to this claim p5 accurate estimate replace estimate with estimationdocsepthe paper analyzes the geometric structure of the objective function for a sparse one hidden layer neural net and has a novel theorem on the convergence rate of the algorithm that recovers the weights in the one hidden layer neural net from the perspective of sample complexity namely how many samples are needed to have the whole recovery of the weights in the neural net it is established that a sparse one hidden layer neural net usually requires fewer samples than a fully connected counterpart the paper uses many simulations to support the theoretical results the sparse neural net can represent the winning ticket for the lottery ticket hypothesis and the sample complexity explains why the winning ticket has better performance the paper seems the first work to focus on the weights recovery of sparse neural networks and provides useful guarantees with important insights and the paper should be distributed to other researchers although the paper has many contributions and should be worth distributed to other researchers the paper appears to be just below the threshold for acceptance to the iclr conference therefore i am afraid i might not recommend to accept the paper in this current form 1 frankly the technical novelty over the reference zhong 2017 recovery guarantees for onehiddenlayer neural networks seems not fully clarified while it is accepted that this work has the sparsity r and proves a tighter bound than zhong 2017 it seems not fully clarified what technical novelty or key breakthrough are directly due to the sparsity setting in this paper compared with the approach that zhong 2017 took the authors should highlight key technical differences in the proof due to the sparsity setting to make it more convincing about the novelty 
of this paper 2 the paper only explains why a sparse neural net should train well with fewer examples than fully connected counterpart the observation has its own merit to my understanding the paper starts from the assumption that the neural net to be learned is sparse however this could not be always correct for the lottery ticket problem thus the assumption seems not always plausible for real applications of the lottery ticket hypothesis and it would be useful if there can be examples of sparse neural networks that come naturally for some problem 3 here is a question for the authors to confirm although the work is on sparse neural networks the sparsity rd is not assumed in theorem 1 and 2 correct even if rd we still have the theorems valid please feel free to let me know if i do not understand welldocsepsummary this paper investigates the theoretical evidence behind improved generalization of the winning lottery tickets more precisely authors characterize the testing error of a pruned network that is then trained using agd under relatively reasonable assumptions they manage to show an improved generalization bound for a properly pruned network over the full network pros this is one of the first works to provide theoretical guarantees for winning lottery tickets under the onehiddenlayer sparse neural network model the objective function being highly nonconvex in general authors rely on local convexity of the latter objective function around the ground truth hence given a good initialization they managed to show that agd achieves a good performance after a certain number of steps numerical experiments are interesting and complement pretty well the theory of the paper consquestions line 6 page 5 you say that the radius of the convex ball is theta1 thetasqrtr how is that even positive as r grows the same comment also holds for your comment after lemma 1 using local convexity which is a uniform property you should be able to prove your main result based on the whole training data instead of sample splitting i dont see why you have only managed to show your result under sample splitting which is less interesting in my opinion the hypothesis of a sparse ground model where you are given the true corresponding support is too restrictive it would have been very interesting to apply a variant of iht to your proposed algorithm i believe your results should hold under relatively similar conditions i strongly think proving your result without prior knowledge of the true support would make the present paper stronger and would avoid having trivial followup works that may take more credit than yours score this paper is well written and the proofs seem sound to me overall i think the present paper is marginally above the acceptance threshold because of the reasons i explain above i am willing to revise my score if the authors give constructive feedback to my concerns comments in theorem 7 it would be good to precise that your initialization is independent from the data you use for your agd ### Summary:
even though the authors revised the problem formulation the paper seems not ready for publication the assumptions are still too strong the learning algorithm assumes knowledge of the sparsity mask the proof technique also heavily relies on zhong et al17 without properly highlighting the difference
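To make the setting these reviews discuss concrete, the following is a minimal illustrative sketch (not code from the paper under review): a one-hidden-layer network whose hidden-layer weight matrix is sparse, trained on isotropic Gaussian inputs with a squared loss while a known binary support mask restricts the updates. The sizes, names, and the plain gradient-descent loop are assumptions made purely for illustration — the paper itself reportedly uses accelerated gradient descent with a tensor-based initialization.

```python
# Illustrative sketch only: sparse one-hidden-layer teacher, known support mask,
# isotropic Gaussian inputs, squared loss. All sizes and names are assumed.
import numpy as np

rng = np.random.default_rng(0)
d, k = 50, 8          # input dimension and number of hidden neurons (assumed)
s = 5                 # nonzero weights per neuron (assumed sparsity)

# Ground-truth sparse hidden-layer weights W_star and the binary mask M of their support.
W_star = np.zeros((k, d))
for j in range(k):
    support = rng.choice(d, size=s, replace=False)
    W_star[j, support] = rng.normal(size=s)
M = (W_star != 0).astype(float)

def forward(W, X):
    # one-hidden-layer network: average of ReLU units, f(x) = mean_j relu(w_j . x)
    return np.maximum(X @ W.T, 0.0).mean(axis=1)

# Isotropic Gaussian inputs and noiseless labels from the sparse teacher.
n = 2000
X = rng.normal(size=(n, d))
y = forward(W_star, X)

# Plain gradient descent on the squared loss, restricted to the known support M
# (this mirrors the "known mask" assumption the reviewers question; the paper
# itself uses an accelerated method and a tensor-based initialization).
W = M * rng.normal(scale=0.1, size=(k, d))
lr = 0.5
for step in range(500):
    hidden = np.maximum(X @ W.T, 0.0)                # (n, k) ReLU activations
    err = hidden.mean(axis=1) - y                    # (n,) prediction error
    grad_hidden = (hidden > 0).astype(float) / k     # d f / d activation
    grad_W = (err[:, None] * grad_hidden).T @ X / n  # (k, d) loss gradient
    W -= lr * (M * grad_W)                           # keep updates on the mask
print("final mean squared error:", float(np.mean((forward(W, X) - y) ** 2)))
```

Under the assumptions the reviews describe, the interesting regime is when the number of samples n only needs to scale with the sparsity of W_star rather than with the full dimension d.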
[ input_ids / attention_mask / labels columns for the lottery-ticket review example above: a token-id sequence, an all-ones mask, and a matching token-id sequence — numeric contents omitted for readability ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors propose a learning scheme for the unsupervised acquisition of skills these skills are then applied to 1 accelerate reinforcement learning to maximize a reward 2 perform hierarchical rl and 3 imitate an expert trajectory the unsupervised learning of skills maximizes an information theoretic objective function the authors condition their policy on latent variable z and the term skill refers to a policy conditioned on a fixed z the mutual information between states and skills is maximized to ensure that skills control the states while the mutual information between actions and skills given the state is minimized to ensure that states not actions distinguish skills the entropy of the mixture of policies is also maximized further manipulations on this objective function enable the scheme to be implemented using a soft actorcritic maximizing a pseudo reward involving a learned skill discriminator the authors clearly position their work in relation to others and especially point out the differences to the most similar work namely gregor et al 2016 these differences while seemingly minor end up providing exceptional improvement in the number of skills learned and the domains tackled the questionanswer style is somewhat unconventional while the content comes across clearly the flow narrative is a bit broken overall i believe that applicability of the work is very wide touching inverse rl hierarchical rl imitation learning and more the simulational comparisons are also very useful however there is an issue that id like to see addressed fig 8 in a highdimensional task namely 111d ant navigation diayn performs slightly worse than others incorporating a prior on useful skills makes diayn perform much better here apart from the comparision with other state of the art rl methods the authors should also compare to vic indeed one of the key differences to vic was the uniform prior on skills which the authors now break albeit in a slightly different way thus it is essential to also show the performance of vic and comment on any similarities differences the relation of this prior to the vic prior should also be made clear further the performance of vic on the half cheetah hurdle should be also be shown if the above issue is addressed i strongly recommend that the work be presented at iclr minor issues typos pg 1 policy that alters that state of the environment to policy that alters the state of the environment pg 3 mutual information between skills and states ia z to mutual information between skills and states is z pg 4 guaranteeing that is has maximum entropy to guaranteeing that it has maximum entropy pg 4 soft actor critic to soft actor critic sac since sac is used later pg 5 full form of vime not introduced fig 5 would be good to also show the variance as a shaded area around the mean pg 7 whereas diayn explicitly skills that effectively partition the state space docseppros mostly clear and wellwritten paper technical contribution learning many diverse skills is principled and welljustified although the basic idea builds on a large body of prior work from related fields as also evidenced from the reference list extensive emperical evaluation on multiple tasks and different scenarios mainly toy examples showing promising results cons the main paper assumes detailed knowledge of the actor critic setup to fully follow and appreciate the paper a few details provided in the appendix 
pz it is not entirely clear to me how the dimensionality of z should be chosen in a principled manner aside from brute force evaluation as in 422 which does not go beyond a few hundreds what happens for many skills and would learning pz be preferable in this scenario note the work has been in the public domain for some time thereby limiting the apparent novelty this has not influenced my decision as per iclr policy significance i think this work would be of interest to the iclr crowd despite it having been in the public domain for a some time it provides a simple objective for training rl models in an unsupervised manner by learning multiple diverse skills and contributes with an extensive and convincing empirical evaluation which will surely have a lasting impact in the rl subfield further commentsquestions the authors assume that only states and not actions are observable intuitively it would seem easier to obtain the desired results if the actions are also available could the authors perhaps clarify why it is reasonable to assume that the actions are not observable to the planner when evaluating the objective in eq 1 similarly id like some insight into the behaviour of the proposed method if actions are also available and how it differs from prior art in this case id suggest enforcing consistently in the way variation across random seeds is visualised in the figures eg traces in fig 4 no indication in fig5 shaped background in fig 6 id suggest making it explicit what theta refers to in eq 1 and provide some details about the sac setup for completeness as previously mentioned minor typos etc p2 l6 missing word sac never defined dof never defined and a few other typospunctuation issues throughout docsepthis paper proposes a method for learning skills in absence of a reward function these skills are learned so that the diversity of the trajectories produced by each skill is maximised this is achieved by having a discriminator attempting to tell these skills apart the agent is rewarded for visiting states that are easy to distinguish and the discriminator is trained to better infer the skills from states visited by the agent furthermore a maximum entropy policy is used to force the skills to be diverse the proposed method is general and any rl algorithm with entropy maximisation in the objective can be used the implementation in the paper uses the soft actor critic method the problem that they are tackling is interesting and is of clear value for obtaining more generalisable rl algorithms the paper is overall clear and easy to follow the results are interesting and potentially useful although i have some reservations regarding how they assess this usefulness in the current version of this paper structurewise i would say that the choice of writing the paper in the form of a qa with very brief explanations and details was more distracting and at times unnecessary than i liked eg question 7 could move to appendix as it is quite trivial i really appreciated how much care has been taken to discuss differences with the closest prior work variational intrinsic control vic by gregor et al one such difference is that their prior distribution over skills is not learnt while there are good arguments by the authors about why this is appealing eg it prevents collapsing to sampling only a few skills i feel this could be also quite a limitation of their method this assumes that you have a good apriori knowledge and assumptions regarding how many skills are useful or needed in the environment this is 
unlikely to be the case in complex environments where you first need to learn simple skills in order to explore the environment and later learn to form new more complex skills during this process you might want to prune simplistic skills after you learnt more abstract and complex ones for instance in the context of continual learning i understand this could be investigated in future work but i feel they take a rather optimistic take on this problem overall the use case for the proposed method is slightly unclear to me while the paper claims to allow diverse set of skills to be learnt it is highly dependent on learning varied action sequences that help you visit different part of state space regardless of their usefulness this means there could be learn a lot of skills that capture part of the state space that is not useful or desirable for downstream tasks while there is a case made for diayn being a stepping stone for imitation learning and hierarchical rl i dont find the reported experiments for imitation learning and hrl convincing in the imitation learning experiment the distance kl divergence between all skills and the expert data is computed and the closest skill is then chosen as the policy imitating the expert the results are weak and no comparisons with any lfd baselines are reported the hrl experiments also lack comparisons to any other hrl baseline i feel that this section is rather weak especially compared to the rest of the paper and i am not sure it achieves much as a general comment the choice of reporting the training progress using hours spent training is an peculiar choice which is never discussed i understand that for methods with varying computational costs this might be a fairer comparison but it would be perhaps good to also report progress against number of required environment interactions including pretraining another assumption made is that the method is valuable in situations where the reward function is expensive to compute and the unsupervised pretraining is free somewhat easing the large amount of pretraining required however it would have been interesting to see examples of such environments in their experiments supporting these claims as this assumption is not valid for the chosen mujoco environments despite these comments i still feel this is valuable work that can clearly inspire further relevant work and deserves to be presented at iclr it presents a solid contribution given its technical novelty proposed applications and its overall generality however the paper could use more convincing experiments to support its claims additional comments and typos figure 5 lack error bars across the 5 random seeds and are crucial to assess whether this performance difference is indeed significant given the amount of pretraining required figure 7s title and caption is missing typo page 3 last paragraph mutual information between skills and states is z not ia z typo page 7 paragraph next to figure 6 whereas diayn explicitly learns skills that effectively partition the state space typo page 7 above figure 8 make them exceedingly difficult for non hierarchical rl algorithms ### Summary:
there is consensus among the reviewers that this is a good paper while it is a bit incremental compared to gregor et al 2016 this paper shows considerably better empirical results
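as an illustration of the diversity objective the reviewers describe above (a learned discriminator that must infer the active skill from visited states, whose log-probability serves as a pseudo-reward for a maximum entropy policy such as sac), here is a minimal sketch; it assumes a discrete skill sampled from a fixed uniform prior and uses pytorch, and all names and sizes are illustrative rather than the authors' actual code

```python
# minimal sketch of a discriminator-based diversity pseudo-reward
# assumptions: discrete skill z drawn from a fixed uniform prior p(z) = 1/K,
# a learned discriminator q_phi(z|s), and an external max-entropy rl learner
# (e.g. sac) that consumes the pseudo-reward; names are illustrative only
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class SkillDiscriminator(nn.Module):
    def __init__(self, state_dim: int, num_skills: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, num_skills),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        # logits over the K skills given a state
        return self.net(state)

def pseudo_reward(disc: SkillDiscriminator, state: torch.Tensor,
                  skill: torch.Tensor, num_skills: int) -> torch.Tensor:
    # r(s, z) = log q_phi(z|s) - log p(z); with a uniform prior -log p(z) = log K
    with torch.no_grad():
        log_q = F.log_softmax(disc(state), dim=-1)
        log_q_z = log_q.gather(-1, skill.unsqueeze(-1)).squeeze(-1)
    return log_q_z + math.log(num_skills)

def discriminator_loss(disc: SkillDiscriminator, states: torch.Tensor,
                       skills: torch.Tensor) -> torch.Tensor:
    # train q_phi to recover which skill generated the visited states
    return F.cross_entropy(disc(states), skills)
```

an off-the-shelf max-entropy rl learner would consume pseudo_reward in place of the task reward, while discriminator_loss is minimised on the same visited states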
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 12661, 247, 4715, 6974, 323, 253, 440, 35421, 11931, 273, 6936, 841, 6936, 403, 840, 3732, 281, 337, 28523, 35221, 4715, 281, 22950, 247, 10921, 374, 1347, 24498, 391, 77, 285, 495, 516, 17255, 271, 6485, 18974, 50276, 783, 440, 35421, 4715, 273, 6936, 11903, 4219, 271, 1491, 253, 30325, 8103, 1159, 253, 4477, 1617, 616, 3646, 327, 21624, 4778, 1182, 285, 253, 1307, 10861, 10770, 281, 247, 3646, 27039, 327, 247, 4229, 1182, 253, 15577, 1491, 875, 3054, 285, 6936, 310, 11903, 1025, 281, 5416, 326, 6936, 1453, 253, 3054, 1223, 253, 15577, 1491, 875, 5231, 285, 6936, 1677, 253, 1375, 310, 36625, 281, 5416, 326, 3054, 417, 5231, 12129, 6936, 253, 15579, 273, 253, 7802, 273, 7823, 310, 671, 11903, 1025, 2007, 49373, 327, 436, 8103, 1159, 8046, 253, 6974, 281, 320, 9009, 970, 247, 2602, 12353, 68, 17425, 46875, 247, 17927, 10921, 7668, 247, 6311, 10861, 7134, 12915, 50276, 783, 4477, 4518, 1899, 616, 789, 275, 5886, 281, 2571, 285, 3340, 1127, 562, 253, 3910, 281, 253, 954, 2074, 789, 10775, 305, 1747, 263, 1162, 355, 4022, 841, 3910, 1223, 16907, 5884, 990, 598, 5277, 18714, 7756, 275, 253, 1180, 273, 6936, 6311, 285, 253, 10625, 11463, 1070, 50276, 783, 1953, 31984, 3740, 310, 8489, 49799, 1223, 253, 2600, 3249, 2439, 4518, 253, 2685, 50276, 79, 3298, 800, 310, 247, 2372, 7154, 50276, 1189, 455, 891, 2868, 326, 30437, 273, 253, 789, 310, 1077, 4618, 19883, 13737, 391, 77, 24498, 391, 77, 45738, 4715, 285, 625, 253, 948, 335, 1050, 14023, 403, 671, 1077, 4217, 50276, 35529, 627, 310, 271, 2523, 326, 2654, 751, 281, 923, 9713, 3036, 854, 275, 247, 1029, 6967, 4836, 10775, 11334, 69, 1331, 15034, 1073, 333, 79, 17923, 5777, 7197, 685, 2571, 24049, 247, 2720, 327, 4217, 6936, 2789, 1073, 333, 79, 1347, 1199, 1805, 1060, 7419, 432, 253, 3294, 1297, 342, 643, 1375, 273, 253, 1445, 391, 77, 3082, 253, 4477, 943, 671, 7277, 281, 15951, 6296, 581, 273, 253, 2234, 3910, 281, 15951, 369, 253, 6447, 2720, 327, 6936, 534, 253, 4477, 1024, 2740, 23447, 275, 247, 5777, 1027, 1039, 3021, 352, 310, 5667, 281, 671, 921, 253, 3045, 273, 15951, 285, 4385, 327, 667, 22620, 50276, 69, 26776, 253, 5886, 273, 436, 2720, 281, 253, 15951, 2720, 943, 671, 320, 1160, 2590, 2007, 253, 3045, 273, 15951, 327, 253, 2716, 1161, 292, 1240, 7929, 34630, 943, 320, 671, 320, 2011, 50276, 338, 253, 1840, 2523, 310, 9713, 891, 7052, 5583, 326, 253, 789, 320, 3559, 387, 17857, 32888, 50276, 37585, 3374, 50276, 555, 993, 23256, 337, 3646, 326, 41077, 326, 1375, 273, 253, 3126, 281, 3646, 326, 41077, 253, 1375, 273, 253, 3126, 23256, 495, 15577, 1491, 875, 6936, 285, 3054, 209, 571, 1182, 281, 15577, 1491, 875, 6936, 285, 3054, 310, 1182, 23256, 577, 12215, 272, 326, 310, 556, 4869, 15579, 281, 12215, 272, 326, 352, 556, 4869, 15579, 23256, 577, 50276, 5530, 12353, 7291, 281, 50276, 5530, 12353, 7291, 7044, 1580, 7044, 310, 908, 1996, 23256, 608, 2120, 830, 273, 362, 553, 417, 5611, 3036, 608, 651, 320, 1175, 281, 671, 921, 253, 11041, 347, 247, 37042, 2170, 1475, 253, 1599, 23256, 818, 5727, 1073, 333, 79, 11120, 6936, 326, 8069, 10883, 253, 1375, 2317, 5474, 339, 377, 2921, 209, 186, 39025, 2590, 285, 973, 15720, 2929, 50276, 186, 48746, 7680, 4715, 1142, 11117, 6936, 310, 3505, 74, 6216, 285, 973, 6309, 1245, 3738, 253, 5044, 2934, 21168, 327, 247, 1781, 2133, 273, 2720, 789, 432, 2905, 4910, 347, 671, 27007, 432, 253, 3806, 1618, 209, 186, 2068, 3134, 802, 468, 474, 7103, 327, 
2709, 8892, 285, 1027, 15216, 7194, 20953, 6667, 4645, 12532, 1543, 50276, 5040, 209, 186, 783, 2022, 2929, 19584, 7000, 3640, 273, 253, 12353, 7291, 9978, 281, 4751, 956, 285, 11435, 253, 2929, 247, 1643, 4278, 2530, 275, 253, 30762, 50276, 186, 81, 91, 352, 310, 417, 7094, 2590, 281, 479, 849, 253, 7877, 1319, 273, 1182, 943, 320, 6777, 275, 247, 3505, 74, 6216, 5133, 9255, 432, 45294, 3490, 7103, 347, 275, 38429, 534, 1057, 417, 564, 4457, 247, 1643, 8307, 752, 6569, 323, 1142, 6936, 285, 651, 4715, 268, 91, 320, 29224, 275, 436, 10076, 209, 186, 9939, 253, 789, 556, 644, 275, 253, 1345, 5028, 323, 690, 673, 7624, 14155, 253, 5165, 38135, 436, 556, 417, 12208, 619, 3061, 347, 591, 17857, 32888, 3646, 50275, 9188, 40348, 891, 1158, 436, 789, 651, 320, 273, 1600, 281, 253, 17857, 32888, 9539, 5747, 352, 1907, 644, 275, 253, 1345, 5028, 323, 247, 690, 673, 352, 3400, 247, 2969, 8103, 323, 3733, 391, 77, 3210, 275, 271, 440, 35421, 5133, 407, 4715, 2709, 11117, 6936, 285, 17904, 342, 271, 9470, 285, 21414, 16774, 7103, 534, 588, 13353, 452, 247, 21692, 3486, 275, 253, 391, 77, 749, 3423, 50275, 44295, 5701, 34974, 209, 186, 783, 4477, 5467, 326, 760, 3054, 285, 417, 5231, 403, 24802, 540, 41597, 352, 651, 1646, 6927, 281, 4044, 253, 6799, 1543, 604, 253, 5231, 403, 671, 2130, 812, 253, 4477, 4931, 19148, 2139, 352, 310, 5272, 281, 5467, 326, 253, 5231, 403, 417, 24802, 281, 253, 499, 9582, 672, 16344, 253, 8103, 275, 16186, 337, 50276, 3549, 6241, 2654, 751, 690, 12288, 715, 253, 8770, 273, 253, 4081, 1332, 604, 5231, 403, 671, 2130, 285, 849, 352, 19986, 432, 2720, 1445, 275, 436, 1083, 209, 186, 301, 1804, 37703, 12724, 275, 253, 1039, 7629, 2439, 3632, 12922, 310, 5304, 1701, 275, 253, 8442, 24088, 20274, 275, 3036, 577, 642, 14011, 275, 3036, 22, 16745, 4114, 275, 3036, 721, 50276, 186, 301, 1804, 2403, 352, 6843, 752, 39116, 10770, 281, 275, 16186, 337, 285, 2085, 690, 4278, 670, 253, 7044, 9978, 323, 29867, 347, 3786, 5393, 209, 186, 37585, 963, 993, 3966, 268, 19, 298, 23, 5816, 3159, 7044, 1620, 2931, 513, 71, 1620, 2931, 285, 247, 1643, 643, 963, 993, 81, 10593, 2368, 3374, 4768, 50276, 7152, 33032, 2520, 2929, 29328, 247, 1332, 323, 4715, 6936, 275, 5928, 273, 247, 10921, 1159, 841, 6936, 403, 6311, 594, 326, 253, 9991, 273, 253, 24102, 4197, 407, 1016, 10861, 310, 11903, 1701, 436, 310, 6786, 407, 1907, 247, 7134, 12915, 13756, 281, 2028, 841, 6936, 7419, 253, 5570, 310, 33302, 323, 13975, 3054, 326, 403, 3477, 281, 12129, 285, 253, 7134, 12915, 310, 10166, 281, 1805, 9441, 253, 6936, 432, 3054, 11580, 407, 253, 5570, 33810, 247, 4869, 15579, 3646, 310, 908, 281, 3490, 253, 6936, 281, 320, 11117, 253, 4081, 1332, 310, 2087, 285, 667, 391, 77, 5933, 342, 15579, 11903, 5837, 275, 253, 8103, 476, 320, 908, 253, 7092, 275, 253, 2929, 4648, 253, 2602, 12353, 7291, 1332, 50276, 783, 1895, 326, 597, 403, 46710, 310, 4722, 285, 310, 273, 2590, 1318, 323, 13546, 625, 2087, 261, 494, 391, 77, 11333, 253, 2929, 310, 4583, 2590, 285, 3477, 281, 956, 253, 1543, 403, 4722, 285, 7826, 4217, 3738, 891, 452, 690, 33196, 5001, 849, 597, 2939, 436, 31471, 275, 253, 1655, 2715, 273, 436, 2929, 2605, 3020, 891, 651, 1333, 326, 253, 4327, 273, 4028, 253, 2929, 275, 253, 830, 273, 247, 2805, 66, 342, 1077, 4864, 22909, 285, 4278, 369, 625, 940, 25031, 285, 387, 2069, 15279, 685, 891, 10490, 24088, 1953, 818, 812, 2118, 281, 30762, 347, 352, 310, 3240, 14916, 50276, 74, 1663, 14109, 849, 1199, 1557, 556, 644, 2668, 281, 2319, 3910, 342, 253, 8642, 2720, 789, 39762, 15276, 1453, 15951, 407, 305, 1747, 
263, 1162, 355, 50276, 531, 824, 3064, 310, 326, 616, 2720, 3268, 689, 6936, 310, 417, 34003, 1223, 627, 403, 1175, 7125, 407, 253, 4477, 670, 2139, 436, 310, 23176, 24088, 352, 16897, 45130, 281, 10491, 760, 247, 1643, 6936, 891, 1928, 436, 812, 320, 671, 3240, 247, 12291, 273, 616, 1332, 436, 19584, 326, 368, 452, 247, 1175, 1049, 7947, 74, 3640, 285, 13260, 5001, 849, 1142, 6936, 403, 4217, 390, 3058, 275, 253, 3126, 436, 310, 11543, 281, 320, 253, 1083, 275, 2570, 12620, 835, 368, 806, 878, 281, 3037, 2969, 6936, 275, 1340, 281, 8338, 253, 3126, 285, 1996, 3037, 281, 830, 747, 625, 2570, 6936, 1309, 436, 1232, 368, 1537, 971, 281, 819, 2517, 8077, 2531, 6936, 846, 368, 34003, 625, 12002, 285, 2570, 4394, 323, 4227, 275, 253, 3634, 273, 45120, 4715, 891, 2096, 436, 812, 320, 6949, 275, 2852, 789, 533, 891, 1928, 597, 1379, 247, 2581, 28684, 1379, 327, 436, 1895, 50276, 1189, 455, 253, 897, 1083, 323, 253, 4081, 1332, 310, 5777, 12744, 281, 479, 1223, 253, 2929, 3916, 281, 1581, 11117, 873, 273, 6936, 281, 320, 34003, 352, 310, 4122, 7976, 327, 4715, 12848, 2250, 6430, 326, 1361, 368, 4143, 1027, 629, 273, 1375, 2317, 10159, 273, 616, 31471, 436, 2097, 627, 812, 320, 3037, 247, 2257, 273, 6936, 326, 9232, 629, 273, 253, 1375, 2317, 326, 310, 417, 4217, 390, 11408, 323, 15450, 8892, 1223, 627, 310, 247, 1083, 1160, 323, 1073, 333, 79, 1146, 247, 24655, 8805, 323, 45738, 4715, 285, 24498, 391, 77, 891, 13414, 1089, 253, 2361, 4679, 323, 45738, 4715, 285, 288, 8435, 21414, 275, 253, 45738, 4715, 3368, 253, 4181, 27451, 23279, 875, 512, 6936, 285, 253, 6485, 941, 310, 10302, 285, 253, 8642, 10861, 310, 840, 6777, 347, 253, 3646, 516, 27427, 253, 6485, 253, 1543, 403, 5075, 285, 642, 14023, 342, 667, 298, 9194, 1666, 25379, 403, 2361, 253, 288, 8435, 4679, 671, 3480, 14023, 281, 667, 643, 288, 8435, 8245, 891, 1928, 326, 436, 2593, 310, 2581, 5075, 3340, 2429, 281, 253, 1551, 273, 253, 2929, 285, 891, 717, 417, 2119, 352, 33526, 1199, 50276, 284, 247, 2087, 4385, 253, 4327, 273, 9610, 253, 3733, 4780, 970, 3038, 5262, 3733, 310, 271, 19532, 4327, 534, 310, 1620, 5469, 891, 2096, 326, 323, 3082, 342, 11962, 15180, 4815, 436, 1537, 320, 247, 22870, 83, 5301, 533, 352, 651, 320, 4931, 1175, 281, 671, 1304, 4780, 1411, 1180, 273, 2424, 3126, 6355, 1690, 3215, 26208, 1529, 9376, 1160, 310, 326, 253, 1332, 310, 9865, 275, 9534, 835, 253, 10921, 1159, 310, 8214, 281, 11897, 285, 253, 440, 35421, 3215, 26208, 310, 1959, 8489, 1842, 272, 253, 1781, 2408, 273, 3215, 26208, 2424, 2299, 352, 651, 452, 644, 4722, 281, 923, 6667, 273, 824, 12620, 275, 616, 4679, 8109, 841, 3916, 347, 436, 9376, 310, 417, 3588, 323, 253, 6777, 278, 10441, 16856, 12620, 50276, 3229, 3784, 841, 5701, 891, 1335, 1928, 436, 310, 9865, 789, 326, 476, 4518, 26761, 2007, 4623, 789, 285, 22828, 281, 320, 3559, 387, 17857, 32888, 352, 10262, 247, 4891, 7680, 1677, 697, 7681, 38135, 4081, 4893, 285, 697, 4583, 31376, 50276, 35529, 253, 2929, 812, 897, 625, 21414, 4679, 281, 1329, 697, 3916, 50276, 38092, 5701, 285, 963, 993, 50276, 13206, 608, 3480, 2228, 8965, 2439, 253, 608, 3632, 12922, 285, 403, 9560, 281, 2939, 1880, 436, 3045, 3064, 310, 6296, 1534, 1677, 253, 2408, 273, 3215, 26208, 2424, 50276, 13206, 818, 84, 4060, 285, 11743, 310, 5816, 50276, 555, 5367, 3239, 495, 1390, 12494, 15577, 1491, 875, 6936, 285, 3054, 310, 1182, 50276, 1439, 209, 571, 1182, 50276, 555, 5367, 3239, 818, 12494, 1735, 281, 4677, 721, 5727, 1073, 333, 79, 11120, 33772, 6936, 326, 8069, 10883, 253, 1375, 2317, 50276, 555, 5367, 3239, 818, 1840, 4677, 
854, 1056, 731, 42508, 2834, 323, 1327, 24498, 391, 77, 11333, 2490, 187, 4118, 18435, 27, 9088, 310, 13969, 2190, 253, 37317, 326, 436, 310, 247, 1175, 2929, 352, 310, 247, 2372, 32809, 2429, 281, 305, 1747, 263, 1162, 355, 4022, 436, 2929, 921, 3240, 1805, 16774, 1543 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 12661, 247, 4715, 6974, 323, 253, 440, 35421, 11931, 273, 6936, 841, 6936, 403, 840, 3732, 281, 337, 28523, 35221, 4715, 281, 22950, 247, 10921, 374, 1347, 24498, 391, 77, 285, 495, 516, 17255, 271, 6485, 18974, 50276, 783, 440, 35421, 4715, 273, 6936, 11903, 4219, 271, 1491, 253, 30325, 8103, 1159, 253, 4477, 1617, 616, 3646, 327, 21624, 4778, 1182, 285, 253, 1307, 10861, 10770, 281, 247, 3646, 27039, 327, 247, 4229, 1182, 253, 15577, 1491, 875, 3054, 285, 6936, 310, 11903, 1025, 281, 5416, 326, 6936, 1453, 253, 3054, 1223, 253, 15577, 1491, 875, 5231, 285, 6936, 1677, 253, 1375, 310, 36625, 281, 5416, 326, 3054, 417, 5231, 12129, 6936, 253, 15579, 273, 253, 7802, 273, 7823, 310, 671, 11903, 1025, 2007, 49373, 327, 436, 8103, 1159, 8046, 253, 6974, 281, 320, 9009, 970, 247, 2602, 12353, 68, 17425, 46875, 247, 17927, 10921, 7668, 247, 6311, 10861, 7134, 12915, 50276, 783, 4477, 4518, 1899, 616, 789, 275, 5886, 281, 2571, 285, 3340, 1127, 562, 253, 3910, 281, 253, 954, 2074, 789, 10775, 305, 1747, 263, 1162, 355, 4022, 841, 3910, 1223, 16907, 5884, 990, 598, 5277, 18714, 7756, 275, 253, 1180, 273, 6936, 6311, 285, 253, 10625, 11463, 1070, 50276, 783, 1953, 31984, 3740, 310, 8489, 49799, 1223, 253, 2600, 3249, 2439, 4518, 253, 2685, 50276, 79, 3298, 800, 310, 247, 2372, 7154, 50276, 1189, 455, 891, 2868, 326, 30437, 273, 253, 789, 310, 1077, 4618, 19883, 13737, 391, 77, 24498, 391, 77, 45738, 4715, 285, 625, 253, 948, 335, 1050, 14023, 403, 671, 1077, 4217, 50276, 35529, 627, 310, 271, 2523, 326, 2654, 751, 281, 923, 9713, 3036, 854, 275, 247, 1029, 6967, 4836, 10775, 11334, 69, 1331, 15034, 1073, 333, 79, 17923, 5777, 7197, 685, 2571, 24049, 247, 2720, 327, 4217, 6936, 2789, 1073, 333, 79, 1347, 1199, 1805, 1060, 7419, 432, 253, 3294, 1297, 342, 643, 1375, 273, 253, 1445, 391, 77, 3082, 253, 4477, 943, 671, 7277, 281, 15951, 6296, 581, 273, 253, 2234, 3910, 281, 15951, 369, 253, 6447, 2720, 327, 6936, 534, 253, 4477, 1024, 2740, 23447, 275, 247, 5777, 1027, 1039, 3021, 352, 310, 5667, 281, 671, 921, 253, 3045, 273, 15951, 285, 4385, 327, 667, 22620, 50276, 69, 26776, 253, 5886, 273, 436, 2720, 281, 253, 15951, 2720, 943, 671, 320, 1160, 2590, 2007, 253, 3045, 273, 15951, 327, 253, 2716, 1161, 292, 1240, 7929, 34630, 943, 320, 671, 320, 2011, 50276, 338, 253, 1840, 2523, 310, 9713, 891, 7052, 5583, 326, 253, 789, 320, 3559, 387, 17857, 32888, 50276, 37585, 3374, 50276, 555, 993, 23256, 337, 3646, 326, 41077, 326, 1375, 273, 253, 3126, 281, 3646, 326, 41077, 253, 1375, 273, 253, 3126, 23256, 495, 15577, 1491, 875, 6936, 285, 3054, 209, 571, 1182, 281, 15577, 1491, 875, 6936, 285, 3054, 310, 1182, 23256, 577, 12215, 272, 326, 310, 556, 4869, 15579, 281, 12215, 272, 326, 352, 556, 4869, 15579, 23256, 577, 50276, 5530, 12353, 7291, 281, 50276, 5530, 12353, 7291, 7044, 1580, 7044, 310, 908, 1996, 23256, 608, 2120, 830, 273, 362, 553, 417, 5611, 3036, 608, 651, 320, 1175, 281, 671, 921, 253, 11041, 347, 247, 37042, 2170, 1475, 253, 1599, 23256, 818, 5727, 1073, 333, 79, 11120, 6936, 326, 8069, 10883, 253, 1375, 2317, 5474, 339, 377, 2921, 209, 186, 39025, 2590, 285, 973, 15720, 2929, 50276, 186, 48746, 7680, 4715, 1142, 11117, 6936, 310, 3505, 74, 6216, 285, 973, 6309, 1245, 3738, 253, 5044, 2934, 21168, 327, 247, 1781, 2133, 273, 2720, 789, 432, 2905, 4910, 347, 671, 27007, 432, 253, 3806, 1618, 209, 186, 2068, 3134, 802, 468, 474, 7103, 327, 
2709, 8892, 285, 1027, 15216, 7194, 20953, 6667, 4645, 12532, 1543, 50276, 5040, 209, 186, 783, 2022, 2929, 19584, 7000, 3640, 273, 253, 12353, 7291, 9978, 281, 4751, 956, 285, 11435, 253, 2929, 247, 1643, 4278, 2530, 275, 253, 30762, 50276, 186, 81, 91, 352, 310, 417, 7094, 2590, 281, 479, 849, 253, 7877, 1319, 273, 1182, 943, 320, 6777, 275, 247, 3505, 74, 6216, 5133, 9255, 432, 45294, 3490, 7103, 347, 275, 38429, 534, 1057, 417, 564, 4457, 247, 1643, 8307, 752, 6569, 323, 1142, 6936, 285, 651, 4715, 268, 91, 320, 29224, 275, 436, 10076, 209, 186, 9939, 253, 789, 556, 644, 275, 253, 1345, 5028, 323, 690, 673, 7624, 14155, 253, 5165, 38135, 436, 556, 417, 12208, 619, 3061, 347, 591, 17857, 32888, 3646, 50275, 9188, 40348, 891, 1158, 436, 789, 651, 320, 273, 1600, 281, 253, 17857, 32888, 9539, 5747, 352, 1907, 644, 275, 253, 1345, 5028, 323, 247, 690, 673, 352, 3400, 247, 2969, 8103, 323, 3733, 391, 77, 3210, 275, 271, 440, 35421, 5133, 407, 4715, 2709, 11117, 6936, 285, 17904, 342, 271, 9470, 285, 21414, 16774, 7103, 534, 588, 13353, 452, 247, 21692, 3486, 275, 253, 391, 77, 749, 3423, 50275, 44295, 5701, 34974, 209, 186, 783, 4477, 5467, 326, 760, 3054, 285, 417, 5231, 403, 24802, 540, 41597, 352, 651, 1646, 6927, 281, 4044, 253, 6799, 1543, 604, 253, 5231, 403, 671, 2130, 812, 253, 4477, 4931, 19148, 2139, 352, 310, 5272, 281, 5467, 326, 253, 5231, 403, 417, 24802, 281, 253, 499, 9582, 672, 16344, 253, 8103, 275, 16186, 337, 50276, 3549, 6241, 2654, 751, 690, 12288, 715, 253, 8770, 273, 253, 4081, 1332, 604, 5231, 403, 671, 2130, 285, 849, 352, 19986, 432, 2720, 1445, 275, 436, 1083, 209, 186, 301, 1804, 37703, 12724, 275, 253, 1039, 7629, 2439, 3632, 12922, 310, 5304, 1701, 275, 253, 8442, 24088, 20274, 275, 3036, 577, 642, 14011, 275, 3036, 22, 16745, 4114, 275, 3036, 721, 50276, 186, 301, 1804, 2403, 352, 6843, 752, 39116, 10770, 281, 275, 16186, 337, 285, 2085, 690, 4278, 670, 253, 7044, 9978, 323, 29867, 347, 3786, 5393, 209, 186, 37585, 963, 993, 3966, 268, 19, 298, 23, 5816, 3159, 7044, 1620, 2931, 513, 71, 1620, 2931, 285, 247, 1643, 643, 963, 993, 81, 10593, 2368, 3374, 4768, 50276, 7152, 33032, 2520, 2929, 29328, 247, 1332, 323, 4715, 6936, 275, 5928, 273, 247, 10921, 1159, 841, 6936, 403, 6311, 594, 326, 253, 9991, 273, 253, 24102, 4197, 407, 1016, 10861, 310, 11903, 1701, 436, 310, 6786, 407, 1907, 247, 7134, 12915, 13756, 281, 2028, 841, 6936, 7419, 253, 5570, 310, 33302, 323, 13975, 3054, 326, 403, 3477, 281, 12129, 285, 253, 7134, 12915, 310, 10166, 281, 1805, 9441, 253, 6936, 432, 3054, 11580, 407, 253, 5570, 33810, 247, 4869, 15579, 3646, 310, 908, 281, 3490, 253, 6936, 281, 320, 11117, 253, 4081, 1332, 310, 2087, 285, 667, 391, 77, 5933, 342, 15579, 11903, 5837, 275, 253, 8103, 476, 320, 908, 253, 7092, 275, 253, 2929, 4648, 253, 2602, 12353, 7291, 1332, 50276, 783, 1895, 326, 597, 403, 46710, 310, 4722, 285, 310, 273, 2590, 1318, 323, 13546, 625, 2087, 261, 494, 391, 77, 11333, 253, 2929, 310, 4583, 2590, 285, 3477, 281, 956, 253, 1543, 403, 4722, 285, 7826, 4217, 3738, 891, 452, 690, 33196, 5001, 849, 597, 2939, 436, 31471, 275, 253, 1655, 2715, 273, 436, 2929, 2605, 3020, 891, 651, 1333, 326, 253, 4327, 273, 4028, 253, 2929, 275, 253, 830, 273, 247, 2805, 66, 342, 1077, 4864, 22909, 285, 4278, 369, 625, 940, 25031, 285, 387, 2069, 15279, 685, 891, 10490, 24088, 1953, 818, 812, 2118, 281, 30762, 347, 352, 310, 3240, 14916, 50276, 74, 1663, 14109, 849, 1199, 1557, 556, 644, 2668, 281, 2319, 3910, 342, 253, 8642, 2720, 789, 39762, 15276, 1453, 15951, 407, 305, 1747, 
263, 1162, 355, 50276, 531, 824, 3064, 310, 326, 616, 2720, 3268, 689, 6936, 310, 417, 34003, 1223, 627, 403, 1175, 7125, 407, 253, 4477, 670, 2139, 436, 310, 23176, 24088, 352, 16897, 45130, 281, 10491, 760, 247, 1643, 6936, 891, 1928, 436, 812, 320, 671, 3240, 247, 12291, 273, 616, 1332, 436, 19584, 326, 368, 452, 247, 1175, 1049, 7947, 74, 3640, 285, 13260, 5001, 849, 1142, 6936, 403, 4217, 390, 3058, 275, 253, 3126, 436, 310, 11543, 281, 320, 253, 1083, 275, 2570, 12620, 835, 368, 806, 878, 281, 3037, 2969, 6936, 275, 1340, 281, 8338, 253, 3126, 285, 1996, 3037, 281, 830, 747, 625, 2570, 6936, 1309, 436, 1232, 368, 1537, 971, 281, 819, 2517, 8077, 2531, 6936, 846, 368, 34003, 625, 12002, 285, 2570, 4394, 323, 4227, 275, 253, 3634, 273, 45120, 4715, 891, 2096, 436, 812, 320, 6949, 275, 2852, 789, 533, 891, 1928, 597, 1379, 247, 2581, 28684, 1379, 327, 436, 1895, 50276, 1189, 455, 253, 897, 1083, 323, 253, 4081, 1332, 310, 5777, 12744, 281, 479, 1223, 253, 2929, 3916, 281, 1581, 11117, 873, 273, 6936, 281, 320, 34003, 352, 310, 4122, 7976, 327, 4715, 12848, 2250, 6430, 326, 1361, 368, 4143, 1027, 629, 273, 1375, 2317, 10159, 273, 616, 31471, 436, 2097, 627, 812, 320, 3037, 247, 2257, 273, 6936, 326, 9232, 629, 273, 253, 1375, 2317, 326, 310, 417, 4217, 390, 11408, 323, 15450, 8892, 1223, 627, 310, 247, 1083, 1160, 323, 1073, 333, 79, 1146, 247, 24655, 8805, 323, 45738, 4715, 285, 24498, 391, 77, 891, 13414, 1089, 253, 2361, 4679, 323, 45738, 4715, 285, 288, 8435, 21414, 275, 253, 45738, 4715, 3368, 253, 4181, 27451, 23279, 875, 512, 6936, 285, 253, 6485, 941, 310, 10302, 285, 253, 8642, 10861, 310, 840, 6777, 347, 253, 3646, 516, 27427, 253, 6485, 253, 1543, 403, 5075, 285, 642, 14023, 342, 667, 298, 9194, 1666, 25379, 403, 2361, 253, 288, 8435, 4679, 671, 3480, 14023, 281, 667, 643, 288, 8435, 8245, 891, 1928, 326, 436, 2593, 310, 2581, 5075, 3340, 2429, 281, 253, 1551, 273, 253, 2929, 285, 891, 717, 417, 2119, 352, 33526, 1199, 50276, 284, 247, 2087, 4385, 253, 4327, 273, 9610, 253, 3733, 4780, 970, 3038, 5262, 3733, 310, 271, 19532, 4327, 534, 310, 1620, 5469, 891, 2096, 326, 323, 3082, 342, 11962, 15180, 4815, 436, 1537, 320, 247, 22870, 83, 5301, 533, 352, 651, 320, 4931, 1175, 281, 671, 1304, 4780, 1411, 1180, 273, 2424, 3126, 6355, 1690, 3215, 26208, 1529, 9376, 1160, 310, 326, 253, 1332, 310, 9865, 275, 9534, 835, 253, 10921, 1159, 310, 8214, 281, 11897, 285, 253, 440, 35421, 3215, 26208, 310, 1959, 8489, 1842, 272, 253, 1781, 2408, 273, 3215, 26208, 2424, 2299, 352, 651, 452, 644, 4722, 281, 923, 6667, 273, 824, 12620, 275, 616, 4679, 8109, 841, 3916, 347, 436, 9376, 310, 417, 3588, 323, 253, 6777, 278, 10441, 16856, 12620, 50276, 3229, 3784, 841, 5701, 891, 1335, 1928, 436, 310, 9865, 789, 326, 476, 4518, 26761, 2007, 4623, 789, 285, 22828, 281, 320, 3559, 387, 17857, 32888, 352, 10262, 247, 4891, 7680, 1677, 697, 7681, 38135, 4081, 4893, 285, 697, 4583, 31376, 50276, 35529, 253, 2929, 812, 897, 625, 21414, 4679, 281, 1329, 697, 3916, 50276, 38092, 5701, 285, 963, 993, 50276, 13206, 608, 3480, 2228, 8965, 2439, 253, 608, 3632, 12922, 285, 403, 9560, 281, 2939, 1880, 436, 3045, 3064, 310, 6296, 1534, 1677, 253, 2408, 273, 3215, 26208, 2424, 50276, 13206, 818, 84, 4060, 285, 11743, 310, 5816, 50276, 555, 5367, 3239, 495, 1390, 12494, 15577, 1491, 875, 6936, 285, 3054, 310, 1182, 50276, 1439, 209, 571, 1182, 50276, 555, 5367, 3239, 818, 12494, 1735, 281, 4677, 721, 5727, 1073, 333, 79, 11120, 33772, 6936, 326, 8069, 10883, 253, 1375, 2317, 50276, 555, 5367, 3239, 818, 1840, 4677, 
854, 1056, 731, 42508, 2834, 323, 1327, 24498, 391, 77, 11333, 2490, 187, 4118, 18435, 27, 9088, 310, 13969, 2190, 253, 37317, 326, 436, 310, 247, 1175, 2929, 352, 310, 247, 2372, 32809, 2429, 281, 305, 1747, 263, 1162, 355, 4022, 436, 2929, 921, 3240, 1805, 16774, 1543 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the authors propose a new approach for enforcing sparsity with the l0 regularization technique proposed by louizos et al that allows for explicit specification of the target sparsity at the end of the optimization process they demonstrate that their approach produces high quality sparse models on a range of applications and datasets strengths 1 the paper is very well written and easy to follow the authors do a good job of highlighting the benefits of their technique including the ability to target specific sparsity levels and the modularity and extensibility provided by this capability 2 i think the results of this work are impactful l0 regularization was a very promising technique but prior work by gale et al was unable to scale it to large models the authors addressed its limitations both in the difficulty of tuning the loss coefficient to achieve a desired level of sparsity and the previously reported issues with training stability i did not identify any major weaknesses in the next section i highlight a few things that could help to bring my score up further i am not aware of any limitations of the proposed approach that were not mentioned in the work the authors did explicitly analyze potential issues for example the dynamics of their constraint during training docsepthis paper mainly focuses on a previous work on l0 regularization and demonstrates its feasibility when solving it with constraints the work considers the problem of l0 regularization thoroughly by 1 learning models with controllable levels of sparsity 2 a dual restart heuristic to avoid excessive regularization 3 fixing the performance drop problem of l0 regularization strengths 1 this work presents valid improvements to previous l0 regularization since i know l0 regularization cannot achieve stable results on imagenet 2 this work gives a nice modification to the original testtime model selection criterion for its median property weaknesses 1 there is already work on dealing with the penalty and instability problems of l0 regularization [1] effective sparsification of neural networks with global sparsity constraint and the final accuracy result is lower than [1] on imagenet (73.44 with 12 params vs 74.68 with 10 params) please discuss the difference and connections with [1] and better make more empirical comparisons 2 the good property of testtime model selection should be further justified can you please give a table of statistics to demonstrate the changing dynamics of the testtime model selected during the training process i dont see any negative societal impact of this work docsepthis paper considers sparse neural network training with cardinality constraints instead of regularization based on stochastic gates they adopted a constrained formulation which allows for simultaneously training and controlling exact sparsity in an end-to-end fashion in terms of the optimization algorithm they considered the lagrangian min-max problem and proposed a gradient descent-ascent algorithm with dual restarts (a minimal code sketch of this update is given after the summary below) they conducted extensive and comprehensive numerical experiments on various network models and demonstrated the effectiveness of their proposal the paper proposes an end-to-end training framework for neural network training with an exact sparsity constraint which avoids tuning the sparsity-inducing regularization coefficients to satisfy the desired sparsity they extended the previous work on stochastic gates louizos et al 2018 to the
constrained version and solve it by gradient descent ascent on its lagrangian the paper is well written and comprehensive their extensive numerical experiments are sound and demonstrate that their method consistently helps achieve the sparsity targets for various models in general the paper is solid and good here are some comments/questions 1 lines 91-94 the introduction of phi is a bit abrupt and may not be friendly to those who have not read the previous work louizos et al 2018 it is probably better to describe the idea of the louizos paper and provide an explicit expression of $\lambda_{\text{pen}}\,\mathbb{E}_{\mathbf{z}|\phi}[\|\mathbf{z}\|_0]$ wrt phi 2 the l0 density throughout the paper in the experiment section figures and tables is it the smoothed version $g_{\text{const}}(\phi_g)$ the lhs of the constraint or the density based on the testtime model it is sort of confusing at first i thought it was the actual parameter density of the final purged model so it is probably better to clarify/define the term somewhere 3 question related to testtime and purging each $\mathbb{E}_{\mathbf{z}|\phi}[\mathbf{1}[z_i \neq 0]]$ is a number between 0 and 1 this means even if $g(\phi_g) \leq \epsilon_g$ holds there are still chances that the z in the testtime model has a larger density compared to $\epsilon_g$ but why are the params in general smaller than the l0 density in table 2 4 question related to the l0 density in the left panel of figure 3 the curves of l0 density do not meet the target sparsity level any intuition why that happens 5 as a follow-up work to louizos et al 2018 yutaro et al 2020 [1] mentioned that the logistic-based hard-concrete distribution also used in this paper yields high-variance gradient estimates and they proposed to use a gaussian distribution to replace the hc distribution for z it would be interesting to check if a gaussian distribution works better in your case [1] yamada yutaro, ofir lindenbaum, sahand negahban and yuval kluger, feature selection using stochastic gates, in international conference on machine learning, pp 10648-10659, pmlr 2020 see strengths and weaknesses docsepthe authors take the l0 penalty method from louizos et al and present a constrained version of it ie estimate the lagrange multiplier to satisfy some desired level of parameter sparsity they do this by specifying a desired parameter sparsity density either for the entire model or per layer the lagrange factor appears to be estimated using non-stochastic gradient descent with one additional trick only apply the constraint when it isnt satisfied experiments are performed using convolutional and fully-connected models on image datasets cifar10 tinyimagenet and imagenet in the supervised learning setting they compare with the penalized version of the regularizer as well as a pruning method based on weight magnitude strengths originality the originality of the paper is somewhat unclear to me as i am not especially familiar with the latest techniques for neural network compression to the best of my knowledge i am not aware of previous works that allow setting a desired number of used parameters rather than setting a lagrange hyperparameter in this regard the paper presents a somewhat original method the authors also present a number of additional aspects which include a dual restart heuristic and techniques for applying regularization to residual networks these are mildly original but i do not see them as a core contribution to the papers novelty quality the paper is fairly high quality the contributions of the paper are clearly defined the diagrams clearly demonstrate the main benefit being able to trace out the curve of parameters vs performance in a more
targeted manner experiments with multiple model architectures and datasets demonstrate the method along with analysis plots in figures 2 and 3 clarity the paper is very clear in its presentation the writing is clear containing emphases on important words and ideas mathematical concepts are generally presented and explained clearly experiments clearly demonstrate the settings described in the paper the figures are all well-presented with clear legends and labels significance assuming that the paper is at least moderately original then i see the results as being at least somewhat significant in particular in table 2 the authors demonstrate that their method is able to train models that contain similar numbers of parameters as magnitude pruning while obtaining significantly improved validation accuracy in some cases a difference as large as 70 in fairness it would be reasonable to also compare with smaller models trained without weight regularization as well as models that are optimized posttraining thus the significance may be somewhat diminished after these additional comparisons again assuming the technique has not been introduced previously the notion of being able to prespecify the size of the model could be useful eg to ensure that the final model fits within memory on a smaller device the authors also claim that their method along with several additional techniques discussed in section 5.1 is able to train l0-regularized residual networks whereas supposedly standard l0 regularization struggles in these settings i am not familiar with this literature so its somewhat unclear to me but i see this as a further significant contribution many large-scale vision models contain residual components and this set of techniques opens up the applicability of l0 regularization to these networks weaknesses originality the main contribution of the paper is to replace the l0-regularized training objective with a constrained objective adjusting the lagrange factor to satisfy this constraint while im generally in support of simple methods this method is so simple as to make me question whether something similar hasnt already been tried before replacing regularized objectives with their constrained counterparts is a common classic technique within the machine learning literature some examples with modern deep networks can be found in variational autoencoders rezende and viola 2018 https://arxiv.org/abs/1810.00597 and policy regularization in reinforcement learning haarnoja et al 2018 https://arxiv.org/abs/1812.05905 given that this method is seemingly so simple in my view it should provide highly significant or surprising results to warrant publication quality the quality of the paper is generally great however i have the following concerns its not clear whether posttraining sparsification would obviate the issues with penalty-based regularization raised by the authors that is one could distill a larger trained model into a smaller model of prespecified size in the introduction the authors point to the additional computational overhead required by these methods however i do not find this to be the most compelling point additional discussion or empirical evaluation around this area would help to strengthen the authors claims experiments are performed entirely within the setting of supervised learning for image classification while its understandable to demonstrate a method with practical applications in domains that are commonly used in practice i worry that this limits the scope of the paper a more complete paper would demonstrate this method in
a range of settings eg within various networks for generative modeling or reinforcement learning given the seeming simplicity of the method its unclear why the authors did not demonstrate this technique more broadly clarity i do not see any major weaknesses in the clarity of the paper the main ideas are presented well significance the significance of this paper may be somewhat limited as noted above it may be the case that posttraining sparsification methods would already provide a solution for obtaining a sparse network with a particular number of parameters if this is the case then the fact that this method is capable of doing so during training seems less consequential also as mentioned above the paper currently only contains empirical evaluations with supervised learning on image datasets expanding this to include other data modalities and tasks would expand the appeal of the paper the main selling point of the paper is the ability to prespecify the number of active parameters in the final trained model that is one no longer has to manually adjust regularization penalty hyperparameters during training while i somewhat believe in the appeal of this idea in practice im not sure whether this would actually save one from having to perform hyperparameter tuning the notion of having to prespecify the density of active parameters particularly per layer is not so conceptually distinct from specifying the size depth and width of the network itself given that we generally sweep over these parameters i do not see replacing a regularization penalty with a constraint to be a huge win here admittedly this method allows one to set a global constraint which is ideal compared with setting per-layer constraint sizes the experiments in table 2 training resnet50 on imagenet are fairly significant in terms of improvements in error however given the additional proposed techniques i wonder whether a penalty-based regularization approach would perform similarly if this is the case then the only remaining benefit is the ease of tuning while the authors do not explicitly address the limitations of their work they discuss the various tradeoffs of different sparsification algorithms to a reasonable extent ### Summary:
ratings 7 8 5 7 confidence 3 4 4 4 discussion among reviewers: no summary: this paper introduces a method for learning neural networks with an exact sparsity target enforced via constraints instead of regularization the method builds on the smoothed l0 regularization objective from louizos et al which was shown to be difficult to scale the reviewers generally agree that the paper is easy to follow introduces ideas that are interesting to the neurips community and that the results look promising the authors wrote detailed responses to the reviewers concerns during the rebuttal period the authors performed additional experiments on imagenet as suggested by reviewer tjdj and updated their paper the reviewers did not respond to this update but their reviews were already positive my recommendation is to accept
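to make the constrained training scheme discussed in the reviews concrete (gradient descent on the model and gate parameters, gradient ascent on a lagrange multiplier for the expected l0-density constraint, and a dual restart once the constraint is met), here is a minimal sketch; the module methods, learning rates and plain sgd updates are assumptions for illustration and not the authors' implementation

```python
# minimal sketch of constrained sparsity training by gradient descent-ascent
# on a lagrangian, with a dual restart heuristic
# assumptions: `model.task_loss(batch)` is differentiable, and
# `model.expected_l0_density()` returns the smoothed expected density computed
# from the model's stochastic gates; plain sgd is used for both updates and
# all names and learning rates are illustrative only
import torch

def train_step(model, batch, lmbda, target_density,
               primal_lr=1e-3, dual_lr=1e-2):
    # primal step: descend on task_loss + lambda * (density - target)
    loss = model.task_loss(batch)
    density = model.expected_l0_density()
    violation = density - target_density  # constraint: density <= target
    lagrangian = loss + lmbda.item() * violation
    model.zero_grad()
    lagrangian.backward()
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is not None:
                p -= primal_lr * p.grad

    # dual step: ascend on lambda, restarting it to zero once the
    # constraint is satisfied so it does not keep over-sparsifying
    with torch.no_grad():
        if violation.item() <= 0.0:
            lmbda.zero_()
        else:
            lmbda += dual_lr * violation.item()
    return loss.item(), density.item(), lmbda.item()
```

here lmbda would be initialised as torch.zeros(1) and carried across steps; with per-layer targets one multiplier per layer would be kept instead of a single scalar, and the reset branch is the dual restart the reviewers credit with avoiding excessive regularization pressure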
[ 292, 818, 22543, 342, 1249, 18912, 4632, 10677, 2358, 342, 884, 18912, 4496, 2319, 253, 3064, 285, 10291, 342, 337, 285, 1805, 1056, 625, 16774, 14023, 50276, 19, 253, 1175, 2867, 273, 1071, 2606, 1566, 5438, 943, 320, 2007, 17285, 476, 368, 4496, 1918, 247, 2829, 273, 9990, 281, 7568, 253, 6890, 8062, 273, 253, 1071, 2606, 1566, 4236, 1309, 253, 3733, 1232, 50275, 74, 13414, 923, 667, 4016, 38058, 3486, 273, 436, 789, 5474, 33032, 2520, 2929, 19401, 23507, 11454, 2990, 3733, 342, 46950, 10806, 3185, 273, 37820, 1754, 327, 253, 19191, 18488, 597, 8671, 247, 20793, 15895, 534, 4483, 323, 10486, 3733, 285, 10938, 3242, 37139, 414, 275, 271, 990, 936, 423, 8142, 275, 2426, 273, 13757, 5933, 597, 2783, 253, 16653, 23623, 1054, 4090, 1895, 285, 4081, 11786, 3229, 1154, 284, 1154, 5933, 342, 8746, 1551, 12863, 597, 5196, 9470, 285, 11088, 10704, 4679, 327, 2710, 2990, 3210, 285, 5183, 253, 12510, 273, 616, 10419, 253, 2929, 29328, 271, 990, 936, 423, 3733, 7792, 323, 11454, 2990, 3733, 342, 3242, 37139, 414, 7658, 534, 32547, 25184, 253, 37139, 414, 527, 32578, 37820, 10303, 281, 10517, 253, 6799, 37139, 414, 597, 6508, 253, 2045, 789, 327, 19191, 18488, 29245, 478, 375, 1162, 355, 4765, 281, 253, 20793, 2715, 285, 8415, 352, 407, 11786, 18499, 49104, 327, 697, 16653, 23623, 253, 2929, 310, 973, 3542, 285, 11088, 616, 9470, 10704, 4679, 403, 3590, 285, 7568, 616, 1332, 12724, 1361, 5115, 253, 37139, 414, 8571, 323, 2710, 3210, 275, 2087, 253, 2929, 310, 4891, 285, 1175, 1060, 403, 690, 5701, 34974, 50276, 18, 1386, 898, 19332, 253, 10199, 273, 815, 74, 310, 247, 2372, 21213, 285, 778, 417, 320, 11453, 281, 1110, 665, 452, 417, 1239, 253, 2045, 789, 29245, 478, 375, 1162, 355, 4765, 352, 310, 3164, 1805, 281, 6266, 253, 2934, 273, 29245, 478, 375, 2929, 285, 2085, 6843, 2048, 273, 24082, 2754, 633, 3878, 1324, 1257, 3342, 91, 545, 478, 17, 8772, 815, 74, 50276, 19, 253, 298, 17, 20425, 4768, 253, 2929, 275, 3368, 2593, 8442, 285, 7180, 310, 352, 253, 43966, 2715, 305, 1156, 3474, 545, 304, 253, 298, 11285, 273, 253, 7658, 390, 253, 4038, 1754, 327, 1071, 2606, 1566, 352, 310, 3686, 273, 21643, 387, 806, 891, 1869, 352, 369, 253, 4588, 4764, 4038, 273, 253, 2457, 1460, 2400, 1566, 594, 352, 310, 3164, 1805, 281, 19148, 3182, 253, 1307, 9366, 50276, 20, 1953, 2905, 281, 1071, 2606, 285, 1460, 3390, 1016, 14168, 49472, 3342, 91, 2162, 18, 91, 460, 82, 470, 310, 247, 1180, 875, 14805, 436, 2097, 1014, 305, 815, 304, 3040, 299, 793, 300, 543, 6556, 627, 403, 1335, 14512, 326, 253, 1182, 275, 253, 1071, 2606, 1566, 556, 247, 4067, 4038, 2429, 281, 299, 793, 300, 543, 533, 2139, 253, 18912, 310, 275, 2087, 4577, 685, 298, 17, 20425, 275, 2829, 374, 577, 1953, 2905, 281, 298, 17, 20425, 275, 253, 1669, 5370, 273, 4677, 495, 253, 9191, 273, 298, 17, 20425, 513, 417, 2525, 253, 2303, 37139, 414, 1268, 50276, 1279, 30328, 2139, 326, 6569, 608, 347, 247, 1563, 789, 273, 29245, 478, 375, 1162, 355, 4765, 340, 307, 15354, 1162, 355, 9169, 337, 5393, 253, 21535, 3169, 1892, 585, 6713, 3268, 671, 908, 275, 436, 2929, 11026, 1029, 87, 14417, 11786, 8197, 285, 597, 4081, 281, 897, 305, 12064, 3268, 281, 8171, 253, 288, 68, 3268, 323, 1182, 352, 651, 320, 4722, 281, 2451, 604, 305, 12064, 3268, 2987, 1805, 275, 634, 1083, 50276, 18, 340, 312, 2960, 340, 307, 15354, 273, 343, 298, 527, 257, 30735, 618, 4608, 2297, 1240, 5568, 285, 340, 86, 1208, 27451, 814, 254, 4735, 5438, 970, 19191, 18488, 275, 5213, 8059, 327, 5145, 4715, 7266, 884, 25020, 740, 21889, 268, 1686, 83, 9169, 50276, 2887, 20544, 285, 32213, 5474, 
339, 431, 248, 4477, 1379, 253, 298, 17, 12339, 1332, 432, 29245, 478, 375, 1162, 355, 285, 1246, 247, 20793, 2715, 273, 352, 26332, 6642, 253, 16653, 6324, 39199, 281, 10517, 690, 6799, 1268, 273, 4764, 37139, 414, 597, 513, 436, 407, 31238, 247, 6799, 4764, 37139, 414, 4038, 2057, 323, 253, 2862, 1566, 390, 591, 3828, 253, 16653, 6324, 2803, 4620, 281, 320, 5998, 970, 1327, 296, 17283, 11786, 18499, 342, 581, 3081, 10480, 760, 4647, 253, 7658, 672, 352, 310, 2649, 10048, 4679, 403, 2684, 970, 27311, 267, 285, 4751, 14063, 3210, 327, 2460, 15302, 260, 338, 274, 740, 10058, 303, 6533, 292, 285, 4440, 257, 292, 275, 253, 22296, 4715, 4758, 597, 7277, 342, 253, 29697, 1025, 2715, 273, 253, 3963, 6081, 347, 973, 347, 247, 819, 25004, 1332, 1754, 327, 2801, 9777, 50276, 296, 3755, 20556, 50276, 19164, 414, 253, 3236, 414, 273, 253, 2929, 310, 8489, 12744, 281, 479, 347, 891, 717, 417, 3340, 7615, 342, 253, 6323, 5609, 323, 11454, 2990, 13800, 281, 253, 1682, 273, 619, 3640, 891, 717, 417, 6600, 273, 2045, 2987, 326, 1581, 4758, 247, 6799, 1180, 273, 908, 3602, 2581, 685, 4758, 247, 16653, 6324, 4373, 19484, 275, 436, 2743, 253, 2929, 10262, 247, 8489, 3236, 1332, 50276, 783, 4477, 671, 1246, 247, 1180, 273, 3081, 7794, 534, 3797, 247, 8746, 19855, 47641, 285, 5609, 323, 9433, 37820, 281, 12541, 6928, 841, 403, 38920, 3236, 533, 891, 513, 417, 923, 731, 347, 247, 5161, 7680, 281, 253, 9380, 38135, 50276, 15177, 50276, 783, 2929, 310, 9648, 1029, 3290, 253, 9021, 273, 253, 2929, 403, 4518, 2931, 253, 21302, 4518, 7568, 253, 2022, 5649, 1146, 2104, 281, 10711, 562, 253, 6970, 273, 3602, 4632, 3045, 275, 247, 625, 10522, 5133, 4679, 342, 2709, 1566, 35615, 285, 15302, 7568, 253, 1332, 2112, 342, 1783, 14777, 275, 8442, 374, 285, 495, 50276, 498, 15752, 253, 2929, 310, 1077, 2590, 275, 697, 9759, 253, 4028, 310, 2590, 4508, 7013, 1169, 327, 1774, 3000, 285, 5697, 15965, 12342, 403, 3839, 3559, 285, 5544, 4518, 4679, 4518, 7568, 253, 7533, 2529, 275, 253, 2929, 253, 8442, 403, 512, 973, 15068, 264, 342, 2590, 38209, 285, 13301, 50276, 9188, 40348, 7384, 326, 253, 2929, 310, 387, 1878, 28249, 3236, 840, 891, 923, 253, 1543, 347, 1146, 387, 1878, 8489, 1534, 50276, 249, 1798, 275, 2829, 374, 253, 4477, 7568, 326, 616, 1332, 310, 2104, 281, 6194, 3210, 326, 3831, 2074, 3904, 273, 3602, 347, 9777, 819, 25004, 1223, 13546, 3012, 5520, 12820, 7200, 275, 690, 2219, 347, 5571, 3064, 275, 28959, 352, 651, 320, 5272, 281, 671, 7277, 342, 4577, 3210, 10166, 1293, 2801, 37820, 347, 973, 347, 3210, 326, 403, 18325, 1501, 31158, 3021, 253, 8453, 778, 320, 8489, 23028, 846, 841, 3081, 14023, 50276, 16245, 7384, 253, 5853, 556, 417, 644, 5611, 3786, 253, 10732, 273, 1146, 2104, 281, 838, 1553, 1419, 253, 1979, 273, 253, 1566, 812, 320, 4217, 24088, 281, 5416, 326, 253, 2457, 1566, 13840, 1561, 3541, 327, 247, 4577, 2813, 50276, 783, 4477, 671, 1750, 326, 616, 1332, 2112, 342, 2067, 3081, 5609, 5469, 275, 2593, 8319, 310, 2104, 281, 6194, 298, 17, 12846, 1025, 12541, 6928, 5727, 24628, 2629, 298, 17, 12846, 1320, 23490, 275, 841, 7533, 891, 717, 417, 7615, 342, 436, 6239, 594, 697, 8489, 12744, 281, 479, 533, 891, 923, 436, 347, 247, 2007, 1534, 7680, 1142, 1236, 2510, 25912, 8113, 3210, 3831, 12541, 4295, 285, 436, 873, 273, 5609, 13279, 598, 253, 30437, 273, 298, 17, 12846, 1320, 281, 841, 6928, 50276, 20881, 1255, 265, 50276, 19164, 414, 253, 2022, 7680, 273, 253, 2929, 310, 281, 8171, 253, 298, 17, 12846, 1025, 3733, 8103, 342, 247, 20793, 8103, 19427, 253, 16653, 6324, 2803, 281, 10517, 436, 7658, 1223, 516, 
3839, 275, 1329, 273, 2969, 3082, 436, 1332, 310, 594, 2969, 347, 281, 1056, 479, 1953, 1880, 1633, 2074, 556, 2649, 2168, 644, 3597, 1078, 15706, 3963, 1025, 16566, 342, 616, 20793, 21421, 310, 247, 1846, 10610, 5853, 1561, 253, 5145, 4715, 6239, 690, 6667, 342, 4980, 3676, 6928, 476, 320, 1119, 275, 39762, 6753, 2083, 351, 398, 294, 91, 9747, 50276, 6584, 6836, 4765, 3614, 39962, 2061, 5375, 1093, 9138, 34651, 285, 3646, 37820, 275, 35221, 4715, 419, 1596, 80, 6362, 1162, 355, 4765, 3614, 39962, 2061, 5375, 1093, 805, 30479, 1762, 1677, 326, 436, 1332, 310, 16907, 594, 2969, 275, 619, 1859, 352, 943, 2085, 4122, 1534, 390, 10084, 1543, 281, 7501, 9311, 50276, 15177, 50276, 783, 3290, 273, 253, 2929, 310, 3839, 1270, 2299, 891, 452, 253, 1563, 7350, 50276, 953, 417, 2590, 1880, 1501, 31158, 37139, 1877, 651, 691, 87, 4513, 253, 3374, 342, 12339, 3169, 37820, 5439, 407, 253, 4477, 326, 310, 581, 812, 940, 408, 247, 4067, 10166, 1566, 715, 247, 4577, 1566, 273, 838, 1553, 1245, 1979, 275, 253, 10199, 253, 4477, 1127, 281, 253, 3081, 15180, 18332, 2424, 407, 841, 3082, 2299, 891, 513, 417, 1089, 436, 281, 320, 253, 954, 18511, 1127, 3081, 5955, 390, 16774, 7103, 1475, 436, 2170, 651, 1361, 281, 17084, 253, 4477, 3916, 50276, 16217, 3825, 403, 2684, 7094, 1561, 253, 4758, 273, 22296, 4715, 323, 2460, 9162, 1223, 697, 34007, 281, 7568, 247, 1332, 342, 8542, 4893, 275, 10625, 326, 403, 7744, 908, 275, 29105, 891, 7664, 326, 436, 7787, 253, 7990, 273, 253, 2929, 247, 625, 3426, 2929, 651, 7568, 436, 1332, 275, 247, 2491, 273, 7533, 24088, 1561, 2710, 6928, 323, 1006, 800, 14053, 390, 35221, 4715, 1677, 253, 41203, 17647, 273, 253, 1332, 697, 12744, 2139, 253, 4477, 858, 417, 7568, 436, 5853, 625, 21450, 50276, 498, 15752, 891, 513, 417, 923, 667, 2201, 32213, 275, 253, 19843, 273, 253, 2929, 253, 2022, 5697, 403, 3559, 973, 50276, 9188, 40348, 253, 8453, 273, 436, 2929, 778, 320, 8489, 3710, 50276, 284, 4879, 1840, 352, 778, 320, 253, 1083, 326, 1501, 31158, 37139, 1877, 3082, 651, 2168, 2085, 247, 2900, 323, 13546, 247, 23507, 2990, 342, 247, 1798, 1180, 273, 3602, 604, 436, 310, 253, 1083, 840, 253, 958, 326, 436, 1332, 310, 7032, 273, 2509, 594, 1309, 3733, 3133, 1679, 4823, 1624, 50276, 12563, 347, 5393, 1840, 253, 2929, 4390, 760, 4428, 16774, 27163, 342, 22296, 4715, 327, 2460, 15302, 16122, 436, 281, 2486, 643, 941, 33433, 285, 8892, 651, 5645, 253, 4549, 273, 253, 2929, 50276, 783, 2022, 10156, 3659, 273, 253, 2929, 310, 253, 3745, 281, 838, 1553, 1419, 253, 1180, 273, 3939, 3602, 275, 253, 2457, 10166, 1566, 326, 310, 581, 642, 3356, 556, 281, 13542, 4575, 37820, 12339, 4373, 22041, 1309, 3733, 1223, 891, 8489, 2868, 253, 4549, 273, 436, 2934, 275, 3946, 516, 417, 2119, 1880, 436, 651, 2686, 5321, 581, 432, 1907, 281, 1347, 4373, 19484, 25184, 253, 10732, 273, 1907, 281, 838, 1553, 1419, 253, 4038, 273, 3939, 3602, 3782, 591, 3828, 310, 417, 594, 4473, 1230, 5799, 432, 31238, 253, 1979, 6864, 285, 4871, 273, 253, 2990, 3139, 1677, 326, 359, 3839, 24516, 689, 841, 3602, 891, 513, 417, 923, 15706, 247, 37820, 12339, 342, 247, 7658, 281, 320, 247, 5699, 3330, 1060, 47421, 436, 1332, 4483, 581, 281, 873, 247, 4156, 7658, 534, 310, 7445, 2429, 342, 4758, 591, 12026, 10806, 50276, 84, 4219, 50276, 783, 4679, 275, 2829, 374, 3733, 501, 3024, 1235, 327, 4440, 257, 292, 403, 9648, 1534, 275, 2426, 273, 11701, 275, 2228, 2299, 1677, 253, 3081, 4081, 5609, 891, 4282, 1880, 247, 12339, 3169, 37820, 2746, 651, 1347, 12014, 604, 436, 310, 253, 1083, 840, 253, 760, 5780, 5649, 310, 253, 11990, 273, 
25184, 1223, 253, 4477, 513, 417, 11120, 2953, 253, 7364, 273, 616, 789, 597, 2319, 253, 2710, 5454, 14273, 273, 1027, 37139, 1877, 11333, 281, 247, 5272, 6070, 2490, 187, 4118, 18435, 27, 9296, 723, 818, 31416, 7162, 495, 24447, 5955, 2190, 30628, 642, 50276, 8774, 436, 2929, 23970, 247, 1332, 323, 4715, 11454, 6928, 342, 271, 3242, 37139, 414, 50276, 7831, 3185, 273, 37820, 10806, 253, 1332, 21168, 327, 253, 43966, 298, 17, 37820, 8103, 432, 29245, 478, 375, 1162, 355, 534, 369, 2011, 281, 320, 2834, 281, 4311, 50275, 783, 30628, 3839, 5194, 326, 253, 2929, 310, 3477, 281, 956, 23970, 5697, 326, 403, 4722, 281, 253, 5723, 2824, 3114, 285, 326, 253, 1543, 1007, 12532, 50275, 783, 4477, 4159, 7000, 6128, 281, 253, 30628, 7350, 1309, 253, 30080, 22559, 2180, 253, 4477, 2684, 3081, 4679, 327, 4440, 257, 292, 347, 5125, 407, 37317, 246, 28034, 75, 285, 9300, 616, 2929, 253, 30628, 858, 417, 3794, 281, 436, 5731, 533, 616, 10123, 497, 2168, 2762, 619, 17401, 310, 281, 2997 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ labels — token-id target sequence for the preceding example (tokenized review and summary text); full numeric array omitted ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper concerns with uniform sampling from deep generative networks such as gans and vaes the training samples of dgns are often biased as they are obatined based on preferences costs or convenience that leads to dgns producing biased examples this paper gives a gemoetry based sampler magnet that given any trained dgn produces samples that are uniformly distributed on the learned manifold it theoretically proves and empirically shows that the magnet produces a uniform distrbution on the manifold regardless of the training set distribution the theoretical proofs require that the dgns only comprise continuous piecewise affine cpa nonlinearities such as relu absolute value maxpooling the three main contributions of the paper are as following a it characterizes the transformation incurred by a density distribution when composed with a cpa mapping b it derives an analytical sampling strategy that allows to obtain a uniform distribution on a manifold that is continuous and piecewise affine c it provides multiple numerical experiments validating the gains of their proposed method magnet strengths a given any trained dgn the paper gives a novel theoretical method to produce samples that are uniformly distributed on the learned manifold regardless of the training set distribution the approach is novel and solves the problem elegantly b it proves the proposed method for a mild assmption that dgn only comprises continuous piecewise affine cpa nonlinearities such as relu absolute value maxpooling c it gives convincing experiments on synthetic dataset showing that regardless of the training set distribution their magnet approach produces samples that are uniformly distributed weakness the paper needs improvement in writing a in section 32 the notation jszi is used without explaining it compute the perregion slope matrices ai jszi please define the notation and explain how to compute the slope matrices b a high level proof of the main theorem 2 in the main paper will help the reader understand the theorem better c the xaxis values of the two plots in figure 3 are different by order of 100 it does not seem correct the paper provides a novel provable method to produce samples that are uniformly distributed on the learned manifold regardless of the training set distribution the method is also well proven empirically through numerous experiments the proposed method help address the problem fairness in samples produced from dgns trained on notsowellrepresentative training datasets docsepthe authors propose a uniform sampling technique for deep generative networks dgns inspired by the probabilistic change of variables formula the technique works with any already trained dgn and does not involve any further training though it does require back propagation wrt the input x in essence the algorithm works by drawing many samples n k from the dgn then sampling from these n samples with probability inversely related to their pushforward density as computed by the changeofvariables formula applying the change of variables formula to augment sampling from dgns is a novel idea moreover the method is interesting and theoretically wellmotivated finally the sampling algorithm itself is very straightforward however i take issue with the framing of the algorithm in the abstract and introduction namely the authors use the colloquial understanding of uniform sidebyside with the differential geometric measure 
theoretic understanding of uniform for example in the abstract the authors state that 1 many generative models today are trained on nonuniform data which has potential implications for fairness data augmentation anomaly detection domain adaptation and beyond and 2 their algorithm produces a uniform distribution on the manifold regardless of the training set distribution this creates the false impression that the present technique is capable of neutralizing the negative implications on fairness data augmentation etc etc this juxtaposition may imply parity between the two definitions of uniform to the inattentive reader while the authors do emphasize the difference between the term in these two contexts much later in the paper i feel that it is not appropriate for the abstract to mislead in this manner again i only have this issue with the framing of the technique not the technique itself with regard to the uniform sampling property of magnet i have two concerns about the practicality of the method 1 the authors have touched upon this but uniform sampling from the data manifold does not imply uniform sampling of attributes this is exacerbated when the model has not fully learned the manifold therefore magnets sampling is only as uniform as the dgn and the data manifold itself is 2 since the dgn can only be trained on the training data distribution sample quality will vary across the true data manifold namely sample quality will likely correlate with density wrt the training data distribution therefore i imagine that sampling uniformly will reduce sample quality overall this seems to be corroborated by qualitative comparison of original v magnet samples in the paper figures computationally the authors demonstrate in appendix d that sampling with n past 250k does not affect the precisionrecall metric but i could not find what n is in the experiments shown and since each image sample requires computing the jacobian of the dgn wrt its input i wonder what is the approximate computation time needed to sample n250000 times for each of the models this method itself is novel and interesting and warrants acceptance into the conference however the current wording of the title and abstract can be misleading and should be edited to remove confusion pros theoretically motivated algorithmically simple cons seems computationally expensive oversells the capabilities of the technique in the abstract namely mathematically uniform is implied to mean semantically uniform needs proofreading docsepthis paper proposes a sampling method called magnet for generative models which aims at sampling data from the latent distribution of the images uniformly this paper proposes a sampling method that aims at providing uniform samples from the latent space for deep generative models providing uniform samples from the latent space is very important but the manuscript requires more quantitative results to support their claims of uniform samples major concerns 1 how does the magnet sampling affect the quality of the generated images common gans literature provides some quantitative evaluation metrics inception score fid score kid score as a justification for their proposed methods but this submission does not provide any justification using the prevalent evaluation metrics the visual quality of some generated images using the magnet method is poorer than using the original gan alone figure 7 9 in addition in addition the provided samples of the generated images figure 15 20 are not clear enough to provide a good justification 
for their visual quality 2 it is hard to tell whether the proposed method truly improves the uniformity of the sampled data in figure 22 1 gender it seems that the magnetstyle reduces gender bias but the magnetpixel increases gender bias 2 hair due to the low clarity of this subfigure it is hard to draw any conclusion 3 glasses it seems that the magnetpixel improves the occurrence of sunglasses and the improvement of magnetstyle is limited 4 age it seems that both methods have some improvement in age 5 emotion the improvement of this subfigure is hard to tell both improves in fear but decrease in disgust 6 accessaries it seems that both methods magnetpixel and magnetstyle can improve the occurrence of headwear but all the results are qualitative the reviewer would like to see some quantitative results such as how close the new distribution sampled using the magnet methods is to the uniform distribution under the same categories gender hair glasses age emotion accessaries provided by the authors comparing to the previous methods minor comments 1 what is the perregion slope matrices mathbfai mathbfjsmathbfzi in the sampling procedure of the magnet 2 the resolution of some figures figure 5 8 15 20 are too low especially in figure 15 20 where the quality of generated samples is hard to verify this paper does not provide enough experimental results both qualitatively and quantitively to support their claims ### Summary:
the paper proposes a simple method for uniform sampling from a generative manifold using the change-of-variables formula the method works by first sampling a much larger number of samples n from a uniform distribution in the latent space and then sampling with replacement using probability proportional to the change in volume to generate a smaller number of final samples k << n that are seen as approximately sampled from a uniform distribution on the generative manifold reviewers had some questions/concerns about the confusing language in the abstract and introduction around the use of the term uniform which the authors have addressed satisfactorily the authors have also provided results on quality (fid metric) of the generated samples as asked by the reviewers while the proposed method is rather simple, has high computational cost, and its novelty is marginal as noted by two of the reviewers, the reviewers agree it is above the acceptance bar
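The resampling scheme the summary describes can be sketched in a few lines of PyTorch. The snippet below is only an illustrative sketch, assuming a pretrained `generator` that maps a batch of latents to images; the function name, the Gaussian proposal, and the one-Jacobian-per-candidate loop are assumptions of this sketch rather than the authors' implementation, and the per-sample Jacobian is exactly the computational cost the reviewers raise.

```python
import torch

def magnet_style_sample(generator, latent_dim, n=10_000, k=256, device="cpu"):
    # draw a large pool of candidate latents (a Gaussian prior is assumed here)
    z = torch.randn(n, latent_dim, device=device)
    log_vol = torch.empty(n, device=device)
    for i in range(n):  # one Jacobian per candidate -- this is the expensive part
        zi = z[i:i + 1]
        J = torch.autograd.functional.jacobian(
            lambda t: generator(t).flatten(), zi)
        J = J.reshape(-1, latent_dim)            # (output_dim, latent_dim)
        _, logdet = torch.linalg.slogdet(J.T @ J)
        log_vol[i] = 0.5 * logdet                # log sqrt(det(J^T J)): local volume change
    weights = torch.softmax(log_vol, dim=0)      # probability proportional to volume change
    idx = torch.multinomial(weights, k, replacement=True)  # sampling with replacement
    return generator(z[idx])                     # approx. uniform on the learned manifold
```

As the reviews point out, the uniformity of the kept samples is bounded by how well the generator has learned the manifold, and n has to be large for the pool to cover it, which is why the computational cost question matters in practice.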
[ input_ids — token-id encoding of the review/summary sample above; full numeric array omitted ]
[ attention_mask — all-ones sequence matching the input_ids above; full numeric array omitted ]
[ labels — token-id target sequence, duplicating the input_ids above; full numeric array omitted ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the manuscript introduces a novel and interesting approach to weight sharing among cnns layers by learning linear combinations of shared weight templates this allows parameter reduction better sample efficiency furthermore the authors propose a very simple way to inspect which layers choose similar combinations of template as well as to push the network toward using similar combinations at each layer this regularization term has a clear potential for computation reuse on dedicated hardware the paper is well written the method is interesting the results are convincing and thoroughly conducted i recommend acceptance 1 it would be interesting to explore how often the layer parameters converge to similar weights and how similar to this end i suggest to plot a 2d heatmap representing the similarity matrices between every pair of layers 2 figure 1 is not of immediate interpretability especially for the middle figure what does the dotted box represent what is the difference between weights and templates also its unclear which of the three options corresponds to the proposed method i would have thought the middle one but the text seems to indicate it is the rightmost one instead 3 how are the alphas initialized how fast are their transitions do they change smoothly over training do they evolve rapidly and plateau to a fixed value or keep changing during training it would be really interesting to plot their value and discuss their evolution 4 while the number of learned parameters is indeed reduced when the templates are shared among layers which could lead to better sample efficiency i am not sure whether the memory footprint on gpu would change ie i believe that current frameworks would allocate the same kernel ntimes if the same template was shared by n layers but i am not certain although the potential reduction of the number of trainable parameters is an important result by itself i wonder if what you propose would also allow to run bigger models on the same device or not without heavy modifications of the inner machineries of pytorch or tensorflow can you comment on this also note that the soft sharing regularization scheme that you propose can be of great interest for fpga hardware implementations that benefit a lot from module reuse you could mention that in the paper 5 sec 41 the number of layers in one group is defined as l43 its unclear to me where the 4 comes from also on page 6 k l23 2 is said to set one template per layer i thought the two formulas would be the same in that case what am i missing is it possible that one of the two formulas contain a typo i believe that at the very least it should be either l2 or l4 in both cases 6 sec 41 i find the notation swrnlwk and swrnlw confusing my suggestion is to set k to be the total number of templates as opposed to the number of templates per group of layers which makes it much easier to relate to and most importantly allows for an immediate comparison with the capacity of the vanilla model as a side effect it also makes it very easy to spot the configuration with one template per layer swrnlwl thus eliminating the need for an adhoc notation to distinguish it 7 the authors inspect how similar the template combination weights alphas are among layers it would also be interesting to look into what is learned in the templates cnn layers are known to learn peculiar and somewhat interpretable template matching filters it would be 
really interesting to compare the filters learned by a vanilla network and its template sharing alternative also i would welcome an analysis of which templates gets chosen the most at each layer in the hierarchy it would be compelling if some kind of pattern of reuse emerged from learning 8 sec 44 it is unclear to me what can be the contribution of the 1x1 initial convolution since it will see no context and all the information at the pixel level can be represented by a binary bit also are the 3x3 convolutions same convolutions if not how are the last feature maps upscaled to be able to predict at the original resolution 9 at the end of sec 44 the authors claim that the scnn is also advantaged over a more rnnlike model i fail to understand how to process this sentence but i have a feeling that its incorrect to make any claims to the performance of rnnlike models as such a model was not used as a baseline in the experiments in any way similarly in the conclusions i find it a bit stretched to claim that you can gain a more flexible form of behavior typically attributed to rnns while its true that the proposed network can in theory learn to reuse the same combination of templates which can be mapped to a network with recursion the results in this direction dont seem strong enough to draw any conclusion and a more indepth comparison against rnn performance would be in order before making any claim in this direction minor sec3 i wouldnt say the parameters are shared among layers in lstms but rather among time unrolls one drawback of the proposed method is that the layers are constrained to have the same shape this is not a major disadvantage but is still a constraint that would be good to make more explicit in the description of the model sec3 end of page 3 does the network reach the same accuracy as the vanilla model when kl also does the network use all the templates how is the distribution of the alpha weights across layers in this case sec31 the v notation makes the narrative unnecessarily heavy i suggest to drop it and refer directly to the templates t also the second part of the section with examples of templates doesnt add much in my opinion and would be better depicted with a figure sec31 the ei notation can be confused with an exp i suggest to replace it with the much more common 1ij figure 2 depicts the relation between the lsm matrix and the topology of the network this should be declared more clearly in the caption in place of the ambiguous capturing implicit recurrencies also the caption should explain what blackwhite stand for as well and possibly quickly describe what the lsm matrix is also it would be more clear that the network in the middle is equivalent to that on the right if the two were somehow connected in the figure to this end they could eg share a single lsm matrix among them finally if possible try and put the lsm matrices on top of the related network so that its clear which network they refer to sec 32 should also refer to fig2 i believe table 1 i suggest to leave the comment on the results out of the caption since its already in the main text table 2 rather than using blue i suggest to underline the overall best results so that its visible even if the paper is printed in bw fig 3 i would specify that its better viewed in color discussion i feel the discussion of table 1 is a bit difficult to follow it could be made easier by reporting the difference in test error against the corresponding vanilla model eg improves the error rate on cifar10 by 026 rather than 
reporting the performance of both models fig 4 are all the stages the same and is the network in the left one such stages if so update the caption to make it clear please fig 4 which lambda has been used is it the same for all stages fig 5 specify that the one on the right is the target grid also i believe that merging the two figures would make it easier to understand eg some of the structure in the target comes from how the obstacles are placed which requires to move back and forth from input to target several times to understand sec 44 space permitting i would like to see at least one sample of what kind of shortest path prediction the network can come up with a few typos end of 32 the closer elements the closer the elements parameter efficiency the period before reparametrizing should probably be a comma fig 4 illustration of stages illustration of the stages end of pag7 an syntetic a synteticdocsepthis work is motivated by the widely recognized issue of overparameterization in modern neural nets and proposes a clever template sharing design to reduce the model size the design is sound and the experiments are valid and thorough the writing is clear and fluent the reviewer is not entirely sure of the originality of this work according to the sparse related work section the contribution is novel but i will leave it to the consensus of others who are more versed in this regard the part that i find most interesting is the fact that template sharing helps with the optimization without even reducing the number of parameters as illustrated in cifar from table 1 the tradeoff of accuracy and parameterefficiency is overall wellstudied in cifar and imagenet although results on imagenet is not as impressive regarding the coefficient alpha im not sure how cosine similarity is computed i have the impression that each layer has its own alpha which is a scalar how is cosine similarity computed on scalars in the experiments theres no mentioning of the regularization terms for alpha which makes me think it is perhaps not important what is the generic setup in summary i find this work interesting and with sufficient experiments to backup its claim on the other hand im not entirely sure of its noveltyoriginality leaving this part open to othersdocsepauthors propose a parameter sharing scheme by allowing parameters to be reused across layers it further makes connection between traditional cnns with rnns by adding additional regularization and using hard sharing scheme the way of parameter sharing is similar to the filter prediction method proposed in rebuff et als work where they model a convolutional layers parameters as a linear combination of a bank of filters and use that to address difference among multiple domains sylvestrealvise rebuffi hakan bilen andrea vedaldi learning multiple visual domains with residual adapters nips 2017 the discussion on the connection between coefficients for different layers and a networks structure and visualization of layer similarity matrix is interesting additional regularization can further encourage a recurrent neural network to be learned however they only experiment with one or two templates and advantage on accuracy and model size over other methods is not very clear ### Summary:
this paper proposed an interesting approach to weight sharing among cnn layers via shared weight templates to save parameters its well written with convincing results reviewers have a consensus on accept
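The core mechanism the reviews describe — each layer's kernel expressed as a learned linear combination of a shared bank of weight templates — can be illustrated with a short PyTorch-style sketch. The class and parameter names below are hypothetical, not the paper's code; the per-layer `alpha` vectors are what the reviews suggest comparing across layers (e.g. via cosine similarity) to read off the implicit layer structure.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedTemplateConv(nn.Module):
    """Conv layer whose kernel is a learned linear combination of shared templates."""
    def __init__(self, template_bank):
        super().__init__()
        self.templates = template_bank            # shared across layers: (k, c_out, c_in, 3, 3)
        self.alpha = nn.Parameter(torch.randn(template_bank.shape[0]) * 0.1)  # per-layer coefficients

    def forward(self, x):
        # layer weight = sum_j alpha_j * template_j  (soft parameter sharing)
        weight = torch.einsum("j,joihw->oihw", self.alpha, self.templates)
        return F.conv2d(x, weight, padding=1)

# one bank of k = 4 templates reused by six layers: each extra layer adds only k coefficients
bank = nn.Parameter(torch.randn(4, 64, 64, 3, 3) * 0.01)
block = nn.Sequential(*[SharedTemplateConv(bank) for _ in range(6)])
```

Sharing the bank reduces the number of trainable parameters, though, as one review notes, the run-time memory footprint depends on whether the framework actually reuses the shared kernels rather than materializing them per layer.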
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

the approach uses feature pyramid networks, which are powerful at detecting small features. the approach shows improvements over the baseline approaches. the approach has potential to be plugged in to other architectures in support of other classification problems.

the work is strongly assuming that abnormalities are always located in the centre of an mri slice. i suspect radiologists would not be comfortable with using a system which does not investigate every part of the image: obviously lesions in the menisci should be searched in the menisci, or ligament lesions in the ligaments. the manuscript has either very detailed descriptions of the mechanics or lacks details about the conducted experiments; contributions and limitations should be clearly stated. the related work section should not be in the appendix because related works are very important; this applies with respect to the specific problem here solved as well as to the previous applications of feature pyramid networks.

---

1. the paper is trying to exploit a structural prior in mri scans specific to the anatomy of knee disorders. i find this idea interesting in itself, although i wonder if the implementation per se (the assumption that key features are centered on mri scans) can be consistently relied upon in clinical practice.
2. robust testing procedure by 5-fold cross-validation; repeatable results with statistical significance.
3. both table 1 and 2 show consistently better results with the mrpyrnet modifications on the baseline networks, especially the rocauc metric; results also corroborated by statistical tests.
4. the paper also provides thorough ablation studies, eg table 4 motivates each component in the system; same for tables 5 and 6, which motivate the use of max probability for the outputs and max pooling.

1. the subsection "feature combination and output prediction" is not very clear: is the max pooling of p_il performed across slices to form a single p_l vector?
2. table 4 suggests the proposed method consistently underperforms in specificity, although other metrics are higher compared to baselines. the authors apply a fully connected layer at each p_l, where l = 1,...,L, to obtain yhat_l, then state they use the highest yhat_l as a predictor. do the authors think using the max yhat_l could be the reason for the underperforming specificity, ie high sensitivity is achieved because of overestimating the probability of disease, although max probability gives the highest accuracy as suggested by table 5?
3. it is not clear to me if the method can handle vertical/horizontal translation; it seems overly reliant on the fact that certain knee features will be localized in the center of the mri image. is this standard medical practice in image acquisition for knee disorder diagnosis? would mri centering require additional effort by clinicians?

---

1. the paper is well structured and presented with sufficient details of methods and results.
2. the proposed architecture and techniques bear similarity with multiple existing architectures (eg encoder-decoder architecture with skip connection, similar to unet) but have some novel aspects.
3. the results are compared with proper baseline methods and discussed properly.

1. the proposed technique, termed fpn, is applied on a prior work's mrnet architecture as the backbone. based on the description provided in the paper, it seems that the application of the proposed techniques to any backbone architecture will increase the depth and parameters of the neural network considerably. as mentioned by the authors, the prior work mrnet uses the famous alexnet architecture, which is not as deep as other succeeding and better performing classification architectures (eg vggnet, resnet, densenet). this raises an obvious question: is the increase in performance compared to the backbone architecture merely a result of increased network capacity? therefore it is important to apply the proposed techniques with more modern architectures as backbones and see if the improvement in performance is consistent across architectures.
2. the authors mention that the relevant features are localized near the centre of mri scans. this assumption as such assumes that all the knee mri scans are acquired centred on the knee; this overlooks the possibility of translation during mri acquisition due to known or unknown reasons.

---

1. the paper is clearly written and formatted.
2. the approach is simple and well motivated.
3. the authors evaluate against two recent papers on this task, observing a significant increase in performance.
4. a thorough evaluation of different parts of the approach is made.
5. code will be made available.

the presented approach is an application of known methods to a new problem, only introducing marginally new ideas. in a method-driven conference such lack of methodological novelty would have been an issue, but since midl has a strong application focus i am inclined towards seeing this paper be presented. however, i would only like to see such a paper be presented in the application track of midl.

### Summary:
the authors propose a new architecture to address a specific detection problem. since the architecture relies on existing components, the methodological novelty is marginal; however, the authors address a novel application and provide a thorough evaluation with an ablation study and statistical tests. the authors took the reviewers' remarks into account and performed additional experiments, and their code will be made available. thus i recommend acceptance of this paper as a poster.
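As a side note, the output head the first reviewer asks about (max pooling of per-slice, per-level features, a classifier per pyramid level, and a max over levels) could look roughly like the sketch below. The class name, the feature dimensions, the sigmoid head, and the three-level pyramid are assumptions made for illustration only; they are not taken from the paper under review.

```python
import torch
import torch.nn as nn


class PyramidSliceClassifier(nn.Module):
    """Sketch of the prediction head discussed in the review: per-level slice
    features are max-pooled across slices, classified per level, and the highest
    per-level probability is kept as the exam-level prediction."""

    def __init__(self, feature_dims):
        super().__init__()
        # One linear classifier per pyramid level (dimensions are assumed).
        self.heads = nn.ModuleList([nn.Linear(d, 1) for d in feature_dims])

    def forward(self, level_features):
        # level_features[l]: (num_slices, feature_dims[l]) vectors for pyramid level l.
        probs = []
        for feats, head in zip(level_features, self.heads):
            pooled, _ = feats.max(dim=0)               # max over slices -> one vector per level
            probs.append(torch.sigmoid(head(pooled)))  # per-level abnormality probability
        return torch.stack(probs).max()                # keep the most confident level


model = PyramidSliceClassifier(feature_dims=[64, 128, 256])
slices = [torch.randn(24, d) for d in (64, 128, 256)]  # 24 MRI slices, 3 pyramid levels
print(model(slices))  # a single probability in (0, 1)
```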
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

overall this is an incremental paper. the authors propose a hierarchical attention layer which computes an aggregation of self-attention layer outputs in the multi-level attention model; this seems like a small improvement. there are results using this hierarchical attention layer instead of the vanilla attention layers on machine reading comprehension and chinese poem generation. the authors should have also included results on more tasks to show the clear improvement of the proposed method.

the issues with this paper are:
- aggregating weights of different layers has been an idea explored before (elmo, cove, etc), so the model improvement itself seems small.
- lack of strong experimental evidence: in my regard the experiments are somewhat incomplete. in both tasks the authors compare only the vanilla model (bidaf, matchlstm, rnet) and the model with ham layers, so it is not clear where the improvement is coming from. it would have made sense to compare the number of parameters, and also to use the same number of vanilla attention layers (which output the last layer) and compare that to the one proposed by the authors.
- since the argument is towards using a weighted average rather than the last layer, there should have been a more detailed analysis on what the weight distribution was and on how important the representations from different layers were.

---

the paper proposes to enhance the existing multi-level attention (self-attention) mechanism by obtaining query, key and value vectors from all levels after weighted-averaging them. the paper claims that this is also theoretically beneficial because the loss function will converge to zero as the number of layers increases. it claims that the proposed architecture outperforms existing attention-based models on an english mrc test (squad), a chinese mrc test, and a chinese poem generation task.

i find three major issues in the paper:

1. i think the proposed hypothesis lacks the novelty that the iclr audience seeks. through many existing architectures (resnet, elmo) we already know that skip connections between cnn layers or a weighted average of multiple lstm layers could improve a model significantly. perhaps this could be an application paper that brings existing methods to a slightly different attention domain, but not only is such a paper less suitable for iclr, it would also require strong experimental results; as i will detail in the second point, i also have some worries about the experiments.
2. the experimental results have problems. for the english mrc experiment (squad), the reproduced matchlstm score is 10 below the number reported in its original paper. furthermore, it is not clear whether the improvement comes from having multiple attention layers (which is not novel) or from weighted-averaging the attention layers; the proposed method, bidaf and matchlstm have single attention layers, so it is not fair to compare them with multi-layer attention.
3. lastly, i am not sure i understood the theoretical section correctly, but it is not very interesting that having multiple layers allows one to approach zero loss more closely. in fact any sufficiently large model can obtain close-to-zero loss on the training data; this is not a sufficient condition for a good model, since we cannot guarantee that the model has generalized well, and it might have just overfit to the training data.

a few minor issues and typos in the paper: first para, second sentence, "in in"; first para, second sentence, "sequence to sequence" should be "sequence-to-sequence"; second-last para of the intro has a sentence fragment; figure 3 would be good to have an english translation.

---

the paper introduces hierarchical attention, where they propose a weighted combination of all the intermediate layers of multi-level attention. the idea is simple and seems to be promising; however, originality seems incremental.

in order to fully demonstrate the significance of the proposed algorithm the authors should conduct more comparisons, for example to multi-level attention; just comparing with one-level attention seems unfair given the significant increase of computation. another aspect of comparison may be to consider computation and performance improvements together and discuss the best tradeoff. the authors should also include some standard benchmark datasets for comparisons; the current ones are good, but it is not so clear what the best state-of-the-art results on them are when compared with all other methods.

the analysis of the network's representation and convergence is nice, but it does not bring much insight. the argument for a decreasing global minimum of the loss function in terms of increasing parameter size can be made for nearly all models, but it is of little practical use since there is no guarantee one can reach the global optimum of these models. i recommend the authors analyze and demonstrate how effective this weighted combination is; for example, the paper can benefit from some clear examples that show the learned weights across the layers and which ones are more important.

the presentation of the paper needs some polishing; for example, there are numerous typos and grammatical errors everywhere.

### Summary:
the authors propose a hierarchical attention layer which combines intermediate layers of multi-level attention. while this is a simple idea and the authors show some improvements over the baselines, the reviewers raised a number of concerns about the validity of the chosen baselines and the lack of more detailed evaluations on additional tasks and analysis of the results. given the incremental nature of the work and the significant concerns raised by the reviewers, the ac is recommending that this paper be rejected.
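To make the mechanism discussed in this review concrete, here is a rough sketch of a learned weighted combination of multi-level self-attention outputs, in the spirit of elmo-style layer mixing. The use of `nn.MultiheadAttention`, the softmax-normalized mixing weights, and the inclusion of the input as level 0 are assumptions for illustration, not the reviewed paper's exact formulation.

```python
import torch
import torch.nn as nn


class HierarchicalAttentionMix(nn.Module):
    """Sketch of the idea the reviewers describe: instead of feeding only the last
    attention layer's output forward, take a learned weighted average of all
    intermediate self-attention outputs."""

    def __init__(self, d_model, num_layers, num_heads=4):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.MultiheadAttention(d_model, num_heads, batch_first=True) for _ in range(num_layers)]
        )
        self.mix = nn.Parameter(torch.zeros(num_layers + 1))  # weights over input + each layer

    def forward(self, x):
        outputs = [x]
        h = x
        for layer in self.layers:
            h, _ = layer(h, h, h)  # self-attention at this level
            outputs.append(h)
        w = torch.softmax(self.mix, dim=0)
        # Weighted average over all levels rather than only the final one.
        return sum(wi * oi for wi, oi in zip(w, outputs))


model = HierarchicalAttentionMix(d_model=32, num_layers=3)
print(model(torch.randn(2, 10, 32)).shape)  # torch.Size([2, 10, 32])
```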
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes a new mcmc transition kernel this kernel is parameterized by neural networks and is optimized through an objective maximizing the proposal entropy specifically the authors use a combination of a flow model and nonvolume preserving flow in dinh et al 2016 as the neural network parameterized kernel then they use the objective in titsias dellaportas 2019 which maximizes the proposal entropy to optimize the kernel the proposed method is tested on synthetic datasets bayesian logistic regression and a deep energybased model the problem of improving the exploration efficiency of mcmc kernels is important the proposed method is wellmotivated as far as i understand the proposed method appears to be technically sound however i have the following concerns about the paper the connection and the difference to previous work are not very clear if i understand it correctly the proposed method seems to be a combination of l2hmc and titsias dellaportas 2019 with some slight modification a flow model since the naive combination did not work well as stated in section 4 i think it would improve the clarity a lot if the authors explain more clearly how the proposed method differs from previous work since the use of a flow model is the main difference compared to the naive combination of two previous methods the authors should explain more about this choice currently it is not clear why this helps and there is no explanation on why the naive combination fails the proposed method seems to use more neural networks eg an additional network r to handle gradients than previous neural network mcmc i wonder how the method performs when considering the cost for example the authors may instead show ess per second on the experiments in section 51 though the authors mentioned the difficulty of computing ess per second in the paper im not entirely convinced as the experiments in section 51 are all very smallscale is it really necessary to use gpus the baselines vary from experiment to experiment for no reason for example the authors compare their method to l2hmc on illconditioned gaussian and strongly correlated gaussian to neutra on funnel distribution and to mala on ebm i think this experiment design needs explanation also there is no empirical comparison to titsias dellaportas 2019 which is closely related to the proposed method some minor comments a1 intends to show the benefit of using gradient information but variant 2 also uses gradient what is the point of showing it it is not clear to me how to interpret the empirical results in section 52 for example how does figure 3 show that the proposed method needs fewer sampling steps the color of points in figure 1 is hard to read docsepthe authors describe an approach for adaptive mcmc which uses a proposal distribution parameterized by a neural network and which optimizes the entropy of the resulting proposal overall i found it to be an interesting algorithm with fairly good results as compared to alternative methods however the description of the algorithm itself was somewhat confusing and/or convoluted the introduction including section 2 was great and provided a concise but very readable introduction to recent approaches for adaptive mcmc the beginning of section 3 was also quite well presented however when the authors turn to the parts of their approach inspired by hmc it gets a bit murkier part of the difficulty here is that at this point the first
paragraph of p4 the authors are describing hmc rather than their method which is not entirely clear perhaps this could have been simplified by moving the related work section earlier in the paper and allowing for the description of hmc before diving into their own approach the relation between hmc and their approach which makes use of intermediate steps xr could also have been more thoroughly explained and given more intuition the link is there but it doesnt seem to be as close to the leapfrog step as xr doesnt correspond to an intermediate step which would be xzn not that im saying in any way that this makes the algorithm incorrect just that the connection isnt quite as clear cut similarly i would also have liked to see a more clear description of the q t and s matrices throughout this work the authors also refer to during training i assume they perform their adaptation steps during the sample process as is the case of the titsias and dellaportas work however this could use some clarification overall i think that this and other confusions cited above could have been done away with by including a clearer outlineoverview of the algorithm as a whole finally overall the results seem to be quite a bit better than competing methods although im not an expert in this area however although the authors hasted to add that they do not compare exact computation time or esssecond it would have been nice to see a more thorough discussion of the relative computational complexity of the alternatives while essgrad does in some sense get close to this the computation necessary for networks depending on their size could play a factor here similarly i would like to see more discussion with regards to the choice of architecture for these networksie were they a simple mlp its possible i missed this but do not think it was discusseddocsepsummary the author proposes a novel mcmc sampler parametrized by the neural networks in particular the neural network is chosen to be a flowbased model that allows the exact evaluation of the proposal probability the update equations of mcmc mimic the dynamics of hmc by incorporating the gradient of the energy function into the flow to ensure the invertibility each update equation only depends on the other part of the variables in addition the author also designs another network r to propose an evaluation location for the target gradient this ensures the similarity between the proposed method and hmc as for the training objective the author proposes to maximize the proposal entropy and acceptance rate controlled by coefficient beta due to the tractability of the proposal density this objective can be analytically computed stop gradient trick is used to stabilize the training and reduce the cost of backpropagation empirically the author evaluates the proposed sampler in some toy datasets logistic regression and deep energybased models the proposed sampler achieves higher ess compared to baselines review clarity the paper is clearly written and easy to follow the author also addresses the related work and mention the difference compared to previous baselines technical soundness i have a quick look at the details of the proposed method it seems the derivation is correct novelty although the structure of the sampler is inspired by l2hmc there are still some differences l2hmc generalizes the hmc update equation by partitioning pmbx into two parts each part is parameterized by neural networks the proposed sampler instead partitions the state pmbz instead of parameter pmbx with 
additional network r the idea of using entropy as an objective to encourage exploration is not new however the novelty lies in the usage of the flow model to allows tractability of proposal density overall the proposed method is novel to some extent significance of the work the proposed method can be regarded as a variant of l2hmc with slightly different nn parameterization and training objectives the advantage of the proposed method is the higher ess compared to previous samplers this may be helpful to some audiences weakness 1 although the author demonstrates the better ess can be obtained using the proposed method i still prefer more analysis to understand the properties of this sampler for example i am curious to know how important the network r is if r is removed and the leapfrog integrator is used ie update zn followed by updating xn and finally zn to mimic hmc update rule then the gradient evaluation is also evaluated at different locations what are the differences in terms of performances 2 the reason that the proposed method can have higher ess is due to the maximization of the proposal entropy although the training objective has an acceptance rate term i am curious to know does the entropy term hurts the sample quality for example in logistic regression the author only reports the ess what about the convergence speed of the sampler compared to others that use sample quality as the training objective what about their performances in logistic regression 3 for training deep ebm apart from ess i also want to know the convergence speed of the ebm training and the quality of the generated images compared to mala these are the standard evaluation metric for ebm 4 for ebm why only compared to mala any reasons why excludes other baselines docsepthe paper argues that a better objective to train neural mcmc kernels is to maximize the proposal entropy titsias dellaportas 2019 and demonstrate a method on doing so the method shows improved sampling efficiency compared to previous method especially one that optimize the alternative l2 expected jump the novelty is not of the training objective but a neural instantiation with improved sampling efficiency pros 1 the method is wellmotivated in section 1 and clearly demonstrated by figure 1 2 the neural instantiation consists a few clever tricks to make the network tractable and explore the space well 3 the method demonstrates improved sampling efficiency measured by ess cons 1 the paper may need more work on presentation section 3 is hard to follow its very long and maybe adding some subsection would help i feel section 4 should go before section 3 given how it is currently presented 2 there is a false statements in the paper one can easily make any transition probability satisfy it by including an additional metropolishastings acceptreject step hastings 1970 its not true one need to additionally ensure that the kernel is irreducible and aperiodic for the reason above its worth the mention that the proposed neural approach is irreducible and aperiodic and give some justificaiton 3 no experiments for correctness check i understand that showing better ess is good but even some biased sampler can lead to much improved ess i think there should be some results on comparing the proposed method against wellestablished hmc on a few bayesian inference problem in terms of the posterior using some hypothesis testing methods to check if the posterior actually matches would give me more confidence that the method is sampling from the target or at least 
comparing a few moments questions 1 in titsias dellaportas 2019 the authors avoid multiple backpropagation by stopping the derivative calculation at the density gradient term in our experiment we find it is necessary for good performance does it mean you are effectively using a biased gradient and it turns out to be better than an unbiased one it looks quite weird to me as it means you are in fact optimizing some other objective which turns out to be better 2 im generally unsatisfied with the setup of section 52 which removes the need to choose the step size the only tunable parameter is a target accept rate i think vanilla hmc or mala can do so by some online optimization for step size as well eg dual averaging as its done in nuts the unstable training of ebms via contrastive divergence comes from the fact that the gradient estimated by a shortrun mcmc is biased the proposed method itself doesnt deal with the bias directly so the only plausible reason to explain why it improves the training is that the mixing is so good such that with the short chain the bias of the gradient is so small is this what is happening i wonder whether or not the simultaneous interweaved training of samplers and ebms would cause any issue in particular will it encourage the ebm to only put energy around data points and the sampler to only effectively draw samples from some noisy distribution nearby are samples from ebms too similar to data points for figure 3b whats the convergence behaviour ie do the learnable sampler and mala converge to a similar entropy or not for figure 3c does longrun mala produce sensible samples if yes it indicates that this way of training ebms avoids some pathology which cd with shortrun mcmc has if no then it may indicate that somehow the learnable sampler coadapts with the ebm to avoid the pathology is the latter what we wanted ### Summary:
this paper proposed an mcmc sampler that combines hmc and a neural network based proposal distribution it is an improvement over l2hmc and titsias dellaportas 2019 with the major innovation being that the proposed normalizing flowbased proposal is engineered such that the density of the proposal q(x'|x) is tractable experiments are conducted on synthetic distributions bayesian logistic regression and deep energybased model training while reviewers are overall happy about the novelty of the approach some clarity issues have been raised in some of the reviewers initial reviews also concerns about the evaluation settings including the missing evaluation metric such as ess per second are raised by the reviewers the revision addressed some of the clarity issues but some experimental evaluation issues still exist eg comparing with l2hmc in terms of ess per second and the replaced mala baseline results make the improvement of the proposed approach less clear i personally find the proposed approach a very interesting concept however i also agree with the reviewers that more experimental studies need to be done in order to understand the real gain of the approach
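Both the reviews and the meta-review above lean on effective sample size (ess, or ess per second) as the headline metric for the sampler. As a point of reference only, the numpy sketch below shows one standard way such an estimate is computed from a single scalar chain; the fft-based autocorrelation, the truncate-at-first-nonpositive rule, and the toy chains are my own illustrative assumptions, not the evaluation code of the paper under review.

```python
# A minimal sketch of ESS = N / (1 + 2 * sum_k rho_k), with the autocorrelation
# sum truncated at the first non-positive term -- a simplification of Geyer's
# initial-sequence rule, assumed here only for brevity.
import numpy as np

def effective_sample_size(samples: np.ndarray) -> float:
    """Estimate the ESS of a single scalar chain from its autocorrelations."""
    x = np.asarray(samples, dtype=float)
    n = x.size
    x = x - x.mean()
    # Autocovariance at all lags via FFT (zero-padded to avoid wrap-around).
    spectrum = np.fft.rfft(x, n=2 * n)
    acov = np.fft.irfft(spectrum * np.conjugate(spectrum))[:n] / n
    rho = acov / acov[0]
    tail = 0.0
    for k in range(1, n):
        if rho[k] <= 0.0:  # stop once correlations have died out
            break
        tail += rho[k]
    return n / (1.0 + 2.0 * tail)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    iid_chain = rng.normal(size=5000)
    # A strongly autocorrelated chain (moving average of white noise).
    slow_chain = np.convolve(rng.normal(size=5000), np.ones(20) / 20, mode="same")
    print(f"iid chain ESS            ~ {effective_sample_size(iid_chain):.0f}")
    print(f"autocorrelated chain ESS ~ {effective_sample_size(slow_chain):.0f}")
```

The ess-per-second figure the reviewers ask for would simply divide such an estimate by the wall-clock time of the chain, which is why they argue it is the fairer comparison when the proposal itself is an expensive neural network.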
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: summary this paper proposes a method for training models that are robust to spurious correlations building upon prior work that uses productofexperts and a model explicitly trained on a dataset bias eg a hypothesisonly model instead of using a model explicitly trained to learn the dataset bias the authors use a weak learner with limited capacity then this model is used in the poe setting as in past work the advantage of this method is that a model developer doesnt need to know that a bias exists since the hope is that the weak learner will implicitly learn the bias strengths a thorough study of using a limitedcapacity auxiliary model to train more robust models which helps a final model ignore spurious correlations that are easy to learn weaknesses the work is a rather straightforward extension of prior work furthermore the authors only evaluate on 2 textual tasks i would have liked to see more experiments with spurious correlations in vision eg vqa or the datasets used in httpsopenreviewnetforumidryxgujrfvs and other experiments on text eg the triviaqacp dataset in the clark paper as is its hard to glean how broadly applicable this method actually is i would have also liked to see more of a comparison with methods that use known bias eg clark et al or he et al it seems like some of the comparisons in the table arent completely fair recommendation 6 i think this paper is a potentially useful extension of a prior method but im still somewhat unconvinced that this method is applicable in settings where the bias is hard to detect which is what we really care about since if the bias is easy to detect we can use clark et al and other methods comments and questions 1 the comparisons to clark et al arent fair comparisons for adversarial squad since the clark et al paper uses a different base model for adversarial squad modified bidaf 2 the weak learner is a rather blunt instrument it picks up dataset biases but it also likely picks up features that are actually useful not all robust features have to be difficult to learn is it possible to better quantify what useful information is being learned and subsequently thrown out by the weak learner this would make it easier to determine if using it is worthwhile 3 while its true that the weak model empirically learns to relearn the same dataset biases targeted in prior work eg negation correlates with contradiction its somewhat unclear to me how well this method would translate to a setting with unknown biases the mnli squad examples are a bit artificial since we already have knowledge of the bias its possible that weak learners can pick up on spurious features that are easy to learn which are the same ones that humans notice id like to see whether this method applies well to tasks where it isnt immediately obvious that the bias is easy to learn perhaps a synthetic experiment would be useful here is it possible to modulate the learnability of the bias the synthetic experiments in the paper suggest that for cases where the bias is hard to learn this method isnt very effective which makes sense in how many of the cases in the literature is the bias hard to learn this is another reason why i think more experiments would be useful docsepsummary this paper focuses on the known problem that current nlp models tend to solve tasks by exploiting superficial properties of the training data that do not generalize for example in the nli task models learn that negation
words are indicative of the label contradiction and high word overlap is indicative of the label entailment there have been many recent solutions proposed for mitigating such behavior but existing methods have tended to assume knowledge of the specific dataset biases a priori in this paper the authors propose a method based on product of experts that doesnt assume particular knowledge of specific dataset biases the method works by first training a weak model and then training a main model using a loss that upweights examples on which the weak model performs poorly namely predicts the wrong answer with high confidence the assumption is that weak models will exploit heuristics and so this method will deincentivize the main model to use those same heuristics the authors evaluate on a range of tasks including a simulated bias setting and nli setting and a qa setting and offer a fair amount of analysis of their results in particular the analysis showing that the weak learners do in fact adopt the biases which have been documented elsewhere in the literature is interesting and the discussion of how weak does the weak learner need to be is appreciated a few questions on this below strengths straightforward method for addressing an important known problem with neural nlp models thorough analysis not just a method and results paper weaknesses novelty might be somewhat limited method is not wildly creative but i dont necessarily think wild creativity is a prerequisite for scientific value the authors do a good job of directly contending with the similar contemporaneous work in their paper additional commentsquestions just a few thoughts that came up while reading the weaknessofweaklearner analysis is interesting i imagine this is not something that can be understood in absolute terms ie i would not expect there to be some level of weakness that is sufficient for all biases and all datasets eg surely the lexical overlap bias is harder to learn than a lexical bias like the presence of negation words since recognizing lexical overlap presupposes recognizing lexical identity therefore id imagine knowing how weak the weak learner needs to be requires some intuition about which biases you are trying to remove which runs counter to the primary thrust of the paper namely removing bias without knowing what the bias is thoughts its interesting that even with this the performance on hans nonentailed is still only 56 which is better but still not exactly good and doesnt suggest the model has learned the right thing so much as its has learned not to use that particular wrong thing for research questions such as this is the model using the heuristic i always find it unsatisfying to think about performance gains that are in between 0 and 100 eg when we talk about human learning we usually see an abrupt shift when the learner gets it and our hope in removing the spurious features with methods like yours would be that wed help the neural models similarly get it and reach 100 at least on examples that isolate the effect of this spurious feature i dont expect you to have an answer for this but just raising to hear your thoughts docsep reason for score the research problem is critical the solution is appropriate and novel the claims are validated the experiments are interesting however the writing in section 3 4 et 5 should be improved if so i would be willing to raise my score my background my research is focused on detecting and avoiding data biases or spurious correlations learned by deep neural networks this is the 
exact scope of this paper however my area of expertise is computer vision and multimodal textimage not natural language processing summary context the paper focuses on automatically detecting data biases learned by natural language processing models and overcoming them using a learning strategy problem the authors identify and tackle issues of stateoftheart methods they are required to already know about a certain bias to be overcome solution and novelty the proposed method consists in 1 training a weak model that aims at detecting biases 2 overcoming these biases by training a main model using a product of experts hinton 2002 with the predictions of the fixed weak model claim a weak model can be used to discover data biases the proposed method produces a main model that generalize better to outofdistribution examples what i liked the most metaproblem of automatically detecting and overcoming biases in neural networks is critical well contextualized relevant issues of state of the art have been identified intro and related work are easy to read and understand novel simple and interesting method to tackle them interesting figures experiments are interesting and well chosen what could be improved 1 abstract introduction and 2 related work your research problem and solution are general and can be applied to many fields is there a specific reason why you decided to focus on nlp only you could improve the impact of your approach by citing papers that tackle the same problem with similar solutions from different fields clark et al 2019 dont take the easy way out ensemblebased methods for avoiding known dataset biases that you already cite ran some experiments in multiple fields nlp vqa etc cadene et al rubi reducing unimodal biases for visual question answering neurips2019 in vqa could also be cited 3 proposed method next to eq1 why an element wise sum is equivalent to an element wise multiplication after softmax it seems wrong to me it could be useful to have a general definition of the poe loss instead of just an example of binary cross entropy in eq2 see 43 you should define poece here 4 experiments overall i think it is important that you improve the writing for this section and reduce jargon it is really difficult to understand for readers that are not familiar with the datasets on which you perform your study also it is really difficult to understand which dataset is indistribution or outofdistribution you dont define development matched accuracy before using it 41 you use too many footnotes that could be included in the text 42 you dont define ce even in the caption of figure2 in table 2 you could reduce jargon by using weak and main instead of w and m in table 2 you dont define an even in the caption 43 i dont understand why poece is better on hard i dont like that you propose to use poece as your method of choice to counteract these effects without defining it in section 3 to be clear i still dont understand what is the learning method that you propose poe or poece 5 analysis 52 title is on two lines instead of one i dont understand when trained jointly with the larger mediumbert weak learner how many parameters dont expect your reader to look at figure 4 to obtain this information 6 conclusion could you add a discussion about the limitations of your approach in particular how to choose the number of parameters of your weak learner what to choose between poe and poece and most critically if you dont assess the type of biases and the amount of biases included in the dataset how to be 
sure that your method will have a beneficial impact then if you need to assess the type of biases using another method that specifically targets them could be more efficient docseppaper summary the authors argue that they have proposed a method to train robust models to biases without having prior knowledge of the biases they argue also to provide analysis on how weak learner capacity impacts the indomainoutofdomain performance reasons to reject 1 the authors argue they have shown the model with limited capacity capture biases however this has been shown already in 1 in 2019 and therefore is not a contribution of the authors 2 the main method proposed in this paper is exactly the same method proposed in 2 please note that 2 was already available in early july 2020 and on top of existing work the paper does not provide other contributions 3 about the third argued contribution on showing how the performance of the debiasing method change based on the capacity of weak learners in 1 the authors included the discussion between the choice of weak learners on their impact though the method in 1 is different the discussion in that paper still would apply here as well please refer to table 13 and figure 1 in 1 given the points above and since the main method in the paper is proposed in 2 the paper does not provide enough contributions to be suitable for the iclr venue 1 robust natural language inference models with example forgetting yaghoobzadeh et al httpsarxivorgpdf191103861pdf 2019 2 towards debiasing nlu models from unknown biases utama et al 13 july 2020 httpsopenreviewnetforumiduhpxm2kjhe emnlp 2020 ### Summary:
this paper considers the problem of learning models for nlp tasks that are less reliant on artifacts and other datasetspecific features that are unlikely to be reliable for new datasets this is an important problem because these biases limit outofdistribution generalization prior work has considered models that explicitly factor out known biases this work proposes using an ensemble of weak learners to implicitly identify some of these biases and train a more robust model the work shows that weak learners can capture some of the same biases that humans identify and that the resulting trained model is significantly more robust on adversarially designed challenge tasks while sacrificing little accuracy on the test sets of the original data sets the papers method is useful straightforward and intuitively appealing the experiments are generally well conducted some of the reviewers raised questions about evaluating on tasks with unknown biases the authors addressed these concerns in discussion and we encourage them to include this in the final version of the paper using the additional page
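The reviews above describe the product-of-experts debiasing recipe only in prose (train a weak model, freeze it, and combine its predictions with the main model so that examples the weak model already solves contribute little to training). As a rough illustration, and because it may bear on the reviewer's question about the element-wise sum next to eq 1, here is a minimal sketch assuming a PyTorch-style classification setup; the function and variable names are illustrative and not taken from the paper under review.

```python
import torch
import torch.nn.functional as F

def poe_loss(main_logits, weak_logits, targets):
    # Summing log-probabilities and renormalising is the same as multiplying
    # the two softmax distributions and renormalising, i.e. a product of
    # experts; this is one way to read the "sum before softmax" step the
    # review asks about (an interpretation, not a quote of the paper's eq 1).
    combined = F.log_softmax(main_logits, dim=-1) + F.log_softmax(weak_logits, dim=-1)
    log_probs = F.log_softmax(combined, dim=-1)
    return F.nll_loss(log_probs, targets)

# Toy usage: the weak model is trained first and then frozen, so its logits
# are detached and the gradient of the loss only updates the main model.
main_logits = torch.randn(4, 3, requires_grad=True)
weak_logits = torch.randn(4, 3).detach()
targets = torch.tensor([0, 2, 1, 0])
poe_loss(main_logits, weak_logits, targets).backward()
```

On examples where the frozen weak model is already confidently correct, the combined distribution is close to the target regardless of the main model's logits, so those examples produce small gradients for the main model; this is the down-weighting of "easy, bias-explained" examples that the reviews describe.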
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper builds on the neural ode framework by using delay differential equations ddes instead of ordinary differential equations odes this can help in the modeling of systems with a time delay effect and overcome many limitations of the neural ode framework using ddes is a novel technique in machine learning and can complement and build on the framework of neural odes and help model systems with time delay dependencies and overcome some limitations of odes the paper was well written with a clear description of the model and theory to support it as well as the algorithm to train it the experiments are well described though they lacked the model parameters and more description on their significance major points 1 model parameters for the experiments were not listed this can be especially important for systems with a time delay effect as in figure 6 and 7 was the same delay used as in the equation of each system what happens when the delay is different 2 while some complex systems were used to demonstrate the potential of the proposed model as in figure 7 the abstract claims that the model can successfully model chaotic dynamics yet that was not shown in the experiments the dynamic regime for these systems with their chosen parameters were not clear together with the effect of time delay on these dynamics exploring this more closely can demonstrate the full potential of nddes 3 the authors claim that unlike nodes nddes can overcome the problem of mutually intersected trajectories in phase space they point to the experiment in figure 6 as evidence of that i feel there needs to be more discussion on this point both theoretically proving that ddes can indeed model intersections in phase space and with a clearer demonstration experimentally docsep the motivation for resorting to ddes over odes is laid out reasonably well although i think the intuition behind why nodes cannot learn certain functions is lacking specifically that trajectories cannot cross due to the lack of memory provided by the delay figure 2 could be more informative as it stands it only indicates that the initial state of a ndde is specified by a function over an interval rather than a single vector which is all that is being used algorithm 1 is challenging to understand without some knowledge of how ddes are solved in the experiments nddes are only compared against nodes but it seems like anodes are the real competition notably in the image experiments performed in the anode paper they report substantially higher accuracies than nddes this is mentioned in the discussion section but i think the paper would benefit from an experiment involving a realworld dataset with some sort of delay component ideally in which nddes outperform nodesanodes the existing synthetic experiments are designed to show the strengths of nddes while the image classification tasks dont substantially differentiate them from nodes the number of function evaluations is not reported anywhere which seems like a significant statistic when nddes are substantially more complex than nodes figure 7 could probably be improved by reducing the number of different parameter values reported eg 3 columns instead of 6 the description also lists 5 parameter values but there are 6 columns the discussion paragraph extensions of the nddes suggests a generalization to more complex delays but all of the models in this paper ensure that the delayed value is always 
the constant initial value also modeling the initial function as an ode is mentioned and i wonder whether the implication is that this would involve learning the initial function as a node itself either way its not obvious how either of these would yield better representationsdocsepthis paper contributes to a recent line of researches linked to neural ordinary differential equations chen et al 2018 and variants this new family of deep neural networks node generalize ideas from residual networks and consider continuous dynamics of hidden units using an ordinary differential equation ode specified by a neural network computing in such networks consists in taking xh0 as input and define the output layer ht to be the solution of an ode initial value problem at time t the main ingredient which makes learning possible is the backpropagation algorithm through the last layer ode solver that relies on the adjoint sensitivity method pontryagin et al 1962 this work considers delay differential equations instead of ode which allows to implement more complex dynamics and thus achieve estimation of more complex functions the contributions of the paper are the following 1 a novel deep continuous time deep neural network defined by the following equations for a given delay tau never discussed neural delay differential equations define dynamics of the form dhtdt fhthttau t w t geq 0 ht phit t leq 0 in this way the network can take into account a former hidden layer 2 the derivation of the adjoint sensitivity method for delayed equations which seems relatively straightforward and which is backed by two proofs in the appendix 3 a novel learning algorithm that implements the forward for h and the backward pass for hlambda the augmented variable and dldw l is the loss by a piecewise ode solver dealing with the different delayed states 4 experiments on 2d toydatasets and on classic differential equations such as mackeyglass are shown to exhibit the ability of dde to cope with those dynamics in constrast to ode 5 experiments on image datasets 6 the strong point of this paper is of course the proposal of the new variant of node which comes with a novel algorithm and overcomes some limitations of node i was interested by the examples of functions not covered by node and covered by nddeand easily convinced by that there are some weak points in this work some of them could be easily improved i think others call for further work relatively minor points the paper is not selfcontent and does not a very good job in explaining the context of neural ode i suggest that you more clearly describe node as a hypothesis class and then as a learning algorithm the two uses of ndde concern modeling timeseries or implementing a classificationregression function can you each time precise inputsoutputs samples you use in training just for clarity the reader guesses of course but its rather slows down the reading this absence of notation and formalization major points and questions to the authors more importantly as a novel algorithm is introduced i expect to see a complexity in time analysis as for node i understand that the complexity in space is favorable an associated question is the role of tau the delay how is it chosen i imagine that if tau converges towards 0 we find the behaviour of node what was its value in the two realworld experiments was it selected by crossvalidation is it too computationally heavy to consider multiple delays eventually i have doubts and questions about the real opportunities for using ndde in the two 
real word datasets yes the divergence of node poses problems in these experiments no doubt that ndde has a more stable empirical behavior with a small variance however in fine the average performance is nearly of the same order a significant difference only on mnist but with a very bad score considering this is an easy problem i wont qualify these results as exceptional and i kindly engage the authors to remain modest now can we cope without delays by introducing new states in the way augmented neural ode are built dupont et al for what kinds of classification problems ndde could be more interesting than node the question is certainly difficult in conclusion the paper tackles a very interesting topic and i had pleasure to discover it with the help of literature although the idea to consider dde instead of ode is incremental its relevant and novel and certainly promising and worth to be explored because it addresses some of the issues of node however i stay on my hunger on different points there are pending questions that require to be answered before acceptance docsepthis paper presents a modeling class of parameterized firstorder differential equations that are conditional on some delayed past states this is a promising direction for the community to go to push past current ode modeling limitations but i recommend for rejection in the current form due to improper characterization and evaluation with respect to prior work more details below strengths to the best of my knowledge this is a novel modeling class for neural odes delay odes are wellstudied in contexts outside of the machine learning community that would be useful to bring into it on the representation capacity delay odes are able to overcome one instance of an intersecting path issue that the usual firstorder neural odes are unable to represent the application to population dynamics and mackeyglass systems that are naturally represented an delay differential equations seems reasonable and wellmotivated that baseline methods may not be able to capture this modeling class weaknesses the strongest weaknesses of this paper are the characterization and evaluation in comparison to dupont 2019 which points out some of the same modeling issues and proposes an alternative way to fix them even though this paper cites dupont2019 it omits that they consider identical experimental settings and does not include their results 1 figure 4 and 5 are almost identical to figures in dupont2019 2 table 1 presents results in mnist and cifar10 and is identical to the setting in dupont2019 the neural ode baselines are almost consistent with the ones reported in dupont2019 but dupont2019 outperforms the method presented here and are omitted from the table dupont2019 additionally considers the svhn dataset other comments and questions is the adjoint method for the nddes very different than the adjoint method for nodes it could be insightful to intuitively explain this the extension to more general delay differential equations seems promising and working out the details in this general case in addition to the case of having a single delay would add to the papers theory and methodological contributions ### Summary:
this paper is a variant of the large and growing class of neural odes and adds a dependency on a time delay to the baseline which makes it possible to model a larger class of physical systems in particular by adding the possibility of crossing paths in phase space after initial evaluation the paper was on the fence with 2 reviewers providing favorable reviews and 2 reviewers recommending rejection particularly important issues raised were the positioning with respect to prior art dupont 2019 with which there is some substantial overlap and requests for theoretical discussion of the class of studied systems and its properties most of these remarks have been addressed by the authors in particular the positioning and the experimental comparisons the ac judged that the paper had been sufficiently improved and recommends acceptance
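This second group of reviews discusses neural delay differential equations of the form dh/dt = f(h(t), h(t − τ), t; w) with h(t) = φ(t) for t ≤ 0. As a rough illustration of how the delayed state enters the dynamics, here is a minimal fixed-step Euler integrator with a history buffer; this is an assumption-laden sketch, not the paper's solver, its choice of τ, or its adjoint-based training.

```python
import numpy as np

def integrate_dde(f, phi, tau, t_end, dt=1e-2):
    """Euler integration of dh/dt = f(h(t), h(t - tau), t) with h(t) = phi(t)
    for t <= 0.  Real DDE solvers use adaptive steps and interpolate the
    stored history; this constant-step version only shows the delay logic."""
    n_delay = int(round(tau / dt))
    ts = np.arange(0.0, t_end + dt, dt)
    hs = [np.asarray(phi(0.0), dtype=float)]
    for i, t in enumerate(ts[:-1]):
        # Delayed state: read it from the history buffer once t - tau >= 0,
        # otherwise evaluate the initial function phi.
        h_delayed = phi(t - tau) if i < n_delay else hs[i - n_delay]
        hs.append(hs[i] + dt * np.asarray(f(hs[i], h_delayed, t), dtype=float))
    return ts, np.stack(hs)

# In a neural DDE, f would be a small neural network with parameters w, and
# the gradient with respect to w would come from the delayed adjoint
# equations discussed in the reviews rather than from unrolling this loop.
```

For example, the delayed logistic (Hutchinson) equation can be run with `integrate_dde(lambda h, hd, t: h * (1.0 - hd), lambda t: 0.5, tau=2.0, t_end=30.0)`; with a large enough delay it oscillates, whereas the corresponding ordinary logistic equation approaches its equilibrium monotonically, which gives some intuition for why the delayed model class is strictly richer.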
14053, 253, 3302, 1159, 347, 271, 258, 615, 310, 5393, 285, 891, 4282, 1880, 253, 27570, 310, 326, 436, 651, 6388, 4715, 253, 3302, 1159, 347, 247, 4666, 3139, 2057, 1039, 697, 417, 4755, 849, 2057, 273, 841, 651, 4917, 1805, 14237, 7152, 33032, 2520, 2929, 17904, 281, 247, 3332, 1386, 273, 29905, 2706, 7939, 281, 11454, 9826, 8967, 7424, 260, 864, 1162, 355, 4765, 285, 11640, 50276, 2520, 747, 2021, 273, 3676, 11454, 6928, 4666, 39970, 5697, 432, 12541, 6928, 285, 1908, 5415, 8062, 273, 8763, 5085, 970, 271, 9826, 8967, 5150, 258, 615, 7616, 407, 247, 11454, 2990, 12672, 275, 824, 6928, 8414, 275, 3192, 1269, 73, 17, 347, 3280, 285, 4853, 253, 3453, 3828, 288, 85, 281, 320, 253, 2900, 273, 271, 258, 615, 3302, 1318, 1895, 387, 673, 246, 253, 2022, 24405, 534, 2789, 4715, 1896, 310, 253, 896, 44263, 318, 5933, 949, 253, 1390, 3828, 258, 615, 47037, 326, 15771, 327, 253, 39200, 7340, 1332, 42020, 610, 26093, 1162, 355, 20208, 50276, 2520, 789, 19401, 5778, 8967, 7424, 3185, 273, 258, 615, 534, 4483, 281, 3359, 625, 2570, 8062, 285, 3021, 5115, 13418, 273, 625, 2570, 3470, 253, 9021, 273, 253, 2929, 403, 253, 1563, 337, 186, 66, 4460, 3676, 5415, 673, 3676, 11454, 2990, 2931, 407, 253, 1563, 7424, 323, 247, 1677, 5778, 29201, 1620, 5469, 11454, 5778, 8967, 7424, 4853, 8062, 273, 253, 830, 277, 384, 7064, 50276, 71, 384, 384, 3115, 246, 259, 50276, 85, 305, 2574, 470, 288, 85, 50276, 545, 262, 246, 458, 82, 470, 50269, 249, 436, 1039, 253, 2990, 476, 1379, 715, 2395, 247, 3438, 8763, 3828, 374, 186, 783, 28529, 273, 253, 39200, 7340, 1332, 323, 13444, 7424, 534, 3133, 4942, 15246, 285, 534, 310, 17245, 407, 767, 27947, 275, 253, 30762, 495, 186, 66, 4460, 4715, 5933, 326, 17930, 253, 3579, 323, 288, 285, 253, 19265, 1509, 323, 288, 2260, 253, 31612, 4778, 285, 277, 392, 88, 298, 310, 253, 2957, 407, 247, 5313, 3020, 258, 615, 47037, 10620, 342, 253, 1027, 13444, 3054, 577, 186, 16217, 3825, 327, 374, 69, 20953, 46906, 1507, 285, 327, 10610, 8967, 7424, 824, 347, 5315, 2364, 25483, 403, 2011, 281, 10738, 253, 3745, 273, 277, 615, 281, 23808, 342, 1110, 8062, 275, 1030, 42836, 281, 258, 615, 608, 186, 16217, 3825, 327, 2460, 15302, 721, 186, 253, 2266, 1127, 273, 436, 2929, 310, 273, 2282, 253, 10419, 273, 253, 747, 12955, 273, 4666, 534, 3249, 342, 247, 4460, 5933, 285, 689, 3217, 690, 7364, 273, 4666, 891, 369, 6110, 407, 253, 6667, 273, 3470, 417, 6107, 407, 4666, 285, 6107, 407, 40515, 615, 395, 4354, 13762, 407, 326, 50275, 9088, 403, 690, 5075, 2792, 275, 436, 789, 690, 273, 731, 812, 320, 4354, 5520, 891, 1158, 2571, 1067, 323, 2007, 50276, 1601, 4942, 5884, 2792, 209, 186, 783, 2929, 310, 417, 1881, 6071, 285, 1057, 417, 247, 1077, 1175, 2628, 275, 15571, 253, 3634, 273, 11454, 258, 615, 891, 1804, 326, 368, 625, 4518, 6266, 4666, 347, 247, 9079, 966, 285, 840, 347, 247, 4715, 5933, 209, 186, 783, 767, 4648, 273, 40515, 615, 4468, 14053, 2069, 12395, 50276, 263, 16994, 247, 9162, 1747, 1256, 1159, 476, 368, 1016, 673, 10799, 14800, 44902, 3530, 368, 897, 275, 3733, 816, 323, 19843, 253, 9414, 5476, 265, 273, 2282, 533, 697, 2581, 49133, 1066, 253, 4361, 436, 5928, 273, 14951, 285, 7473, 1320, 50276, 24330, 2792, 285, 3533, 281, 253, 4477, 209, 186, 3062, 15538, 347, 247, 4460, 5933, 310, 5611, 891, 1902, 281, 923, 247, 10454, 275, 673, 1783, 347, 323, 4666, 891, 2096, 326, 253, 10454, 275, 2317, 310, 13857, 50276, 186, 266, 2330, 1953, 310, 253, 2554, 273, 29201, 253, 5778, 849, 310, 352, 6777, 50276, 74, 8564, 326, 604, 29201, 26414, 4404, 470, 359, 1089, 253, 8770, 273, 4666, 50276, 
5371, 369, 697, 1318, 275, 253, 767, 1524, 10186, 4679, 369, 352, 4236, 407, 2831, 29599, 50275, 261, 352, 1512, 43245, 5536, 281, 1908, 2709, 20219, 50275, 186, 8045, 1230, 891, 452, 24626, 285, 3533, 670, 253, 1524, 9091, 323, 970, 40515, 615, 275, 253, 767, 1524, 3159, 15302, 4754, 253, 23279, 273, 4666, 24543, 3237, 50276, 249, 841, 4679, 642, 5545, 326, 40515, 615, 556, 247, 625, 6474, 16774, 3879, 342, 247, 1355, 11041, 2299, 275, 4030, 253, 3388, 3045, 310, 4829, 273, 253, 1072, 1340, 247, 1534, 3064, 760, 327, 278, 79, 382, 533, 342, 247, 1077, 3076, 4868, 7296, 436, 310, 271, 3477, 1895, 891, 31451, 19478, 841, 1543, 347, 18714, 285, 891, 26604, 11377, 253, 4477, 281, 3464, 16453, 209, 186, 2666, 476, 359, 23808, 1293, 20219, 407, 16984, 747, 3054, 275, 253, 1039, 31612, 50276, 570, 1546, 258, 615, 403, 4270, 24708, 834, 1162, 355, 50276, 186, 1542, 752, 9351, 273, 9162, 3237, 40515, 615, 812, 320, 625, 4722, 685, 4666, 50276, 783, 1953, 310, 5604, 2834, 50276, 249, 6452, 253, 2929, 39223, 247, 1077, 4722, 9400, 285, 891, 574, 11284, 281, 9413, 352, 342, 253, 1361, 273, 6239, 3738, 253, 2934, 281, 1908, 277, 615, 3185, 273, 258, 615, 310, 32809, 697, 4623, 285, 4460, 285, 5604, 12532, 285, 4409, 281, 320, 14859, 984, 352, 12453, 690, 273, 253, 3374, 273, 4666, 2299, 891, 3297, 327, 619, 23635, 327, 1027, 2792, 627, 403, 13910, 3533, 326, 2430, 281, 320, 9577, 1078, 14924, 5474, 33032, 2520, 2929, 10262, 247, 14053, 966, 273, 4764, 1025, 806, 2621, 8967, 7424, 326, 403, 17697, 327, 690, 13444, 2469, 3054, 436, 310, 247, 12532, 3884, 323, 253, 3114, 281, 564, 281, 7450, 2469, 1655, 258, 615, 14053, 7364, 533, 891, 5583, 323, 18235, 275, 253, 1655, 830, 1955, 281, 14697, 14846, 285, 7103, 342, 1675, 281, 2720, 789, 625, 4278, 2708, 50275, 296, 3755, 20556, 281, 253, 1682, 273, 619, 3640, 436, 310, 247, 4460, 14053, 966, 323, 11454, 258, 3229, 5778, 258, 3229, 403, 973, 14091, 728, 275, 22349, 3345, 273, 253, 5145, 4715, 3114, 326, 651, 320, 4217, 281, 3324, 715, 352, 50276, 251, 253, 6779, 5350, 5778, 258, 3229, 403, 2104, 281, 11399, 581, 4227, 273, 271, 23965, 272, 1854, 2523, 326, 253, 7312, 806, 2621, 11454, 258, 3229, 403, 7591, 281, 1957, 50276, 783, 2898, 281, 3072, 8062, 285, 5315, 2364, 25483, 2718, 326, 403, 10748, 6607, 271, 5778, 8967, 7424, 3133, 5272, 285, 973, 24013, 8550, 326, 8245, 3082, 778, 417, 320, 2104, 281, 9232, 436, 14053, 966, 50275, 20881, 1255, 265, 253, 19508, 32213, 273, 436, 2929, 403, 253, 14846, 285, 7103, 275, 5301, 281, 24708, 834, 6247, 534, 2792, 562, 690, 273, 253, 1072, 14053, 3374, 285, 29328, 271, 5795, 1039, 281, 4993, 731, 1014, 2167, 436, 2929, 28070, 24708, 834, 9638, 352, 7005, 953, 326, 597, 1908, 8931, 5661, 7533, 285, 1057, 417, 2486, 616, 1543, 50274, 18, 4677, 577, 285, 608, 403, 2761, 8931, 281, 8442, 275, 24708, 834, 9638, 50274, 19, 2829, 337, 10262, 1543, 275, 278, 79, 382, 285, 260, 338, 274, 740, 285, 310, 8931, 50272, 936, 253, 4758, 275, 24708, 834, 9638, 253, 11454, 258, 615, 1666, 25379, 50272, 609, 2761, 5185, 342, 253, 4394, 2361, 275, 24708, 834, 9638, 533, 50272, 34856, 834, 9638, 41731, 13015, 253, 1332, 3559, 1060, 285, 403, 50272, 297, 2166, 432, 253, 2829, 24708, 834, 9638, 23000, 50272, 5040, 5852, 253, 18504, 13107, 10895, 50275, 977, 5701, 285, 3533, 310, 253, 39200, 1332, 323, 253, 295, 1678, 265, 1077, 1027, 685, 253, 39200, 1332, 323, 7632, 352, 812, 320, 47860, 281, 540, 41597, 5513, 436, 50276, 783, 6880, 281, 625, 2087, 5778, 8967, 7424, 3133, 12532, 285, 2444, 562, 253, 4278, 275, 436, 2087, 1083, 275, 
1635, 281, 253, 1083, 273, 1907, 247, 2014, 5778, 651, 823, 281, 253, 9380, 3762, 285, 35961, 9021, 187, 187, 4118, 18435, 27, 2520, 2929, 310, 247, 12955, 273, 253, 1781, 5675, 966, 273, 11454, 258, 3229, 285, 11323, 18925, 327, 247, 673, 5778, 281, 253, 8245, 534, 4483, 281, 1566, 247, 4067, 966, 273, 3520, 2718, 275, 1798, 6240, 253, 6387, 273, 14270, 11865, 275, 3408, 2317, 50276, 6438, 3302, 7103, 253, 2929, 369, 327, 253, 19354, 342, 374, 30628, 5277, 13857, 10123, 285, 374, 30628, 46705, 18235, 247, 1798, 1774, 2523, 5439, 369, 19274, 342, 1675, 281, 2720, 1445, 24708, 834, 6247, 342, 690, 6832, 14787, 875, 253, 9380, 9762, 273, 10527, 11985, 273, 253, 966, 273, 5421, 2718, 285, 697, 3607, 50276, 2252, 273, 841, 16157, 452, 644, 9713, 407, 253, 4477, 275, 1798, 19274, 285, 5661, 14023, 50276, 783, 913, 24242, 326, 253, 2929, 574, 644, 10481, 5520, 285, 32636, 14924 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper introduces a novel 2stage classification network where the 1st stage consists of a normal feedforward pass through a network and the 2nd passintrospection creates a set of features using gradients of the penultimate layer of feedforward network and passes it through another mlp to predict the correct class the set of features for each sample is created by concatenating the classconditioned crossentropy gradients at the penultimate layer for each class in the dataset in other words they calculate crossentropy gradients for each sample by exhaustively setting every target class as the sample label these gradients are then concatenated and passed through the mlp to generate the class label they show that this approach gives them sota for cifar10c and cifar10 cure datasets strengths 1 the paper proposes a novel approach to rectify predictions for a classification network 2 they show theoretically that the features calculated using gradients could be useful 3 they show results on cifar10c and cifar10 cure datasets weaknesses 1 this applicability of this work is currently limited to classification tasks that have very few target classes and a penultimate layer of modest dimension 2 results are shown only for cifar10 related dataset and raises concerns if they work for other datasets like flowers 102caltech 101 cub200 etc 3 introspection stage is memory and computeintensive the authors should find ways to address this and other scaling issues yes docsepthe authors propose a twostage formulation for neural network decision making inspired by human introspection the first stage is the regular discriminative feedforward stage where a deep network processes an input and generates the logit features the introspective features for class i are then given as the change in the network parameters up until the last layer when a label of i is introduced for a sample x the introspective features are then used to train a second introspective network which tries to match the class predictions obtained from the introspective features to the ground truth predictions as well as the predictions from the first network the authors provide theoretical derivations for these features their approximation sparsity as well as how to extract them efficiently the extensive experiments performed by the authors demonstrate that introspective learning improve neural network performance in terms of robustness calibration domain adaptation and active learning strengths 1 while the idea of logit gradients encoding class similarity information is not new using this information to improve robustness and calibration in neural networks is quite novel and very well motivated theoretically and empirically the biological analogue drawn to kahnemans fastslow learning to motivate introspective learning is relevant but somewhat tenuous 2 the experiments conducted on robustness and calibration show that introspection helps improve both the outofdistribution test accuracy for distorted images as well as the calibration of the neural network the introspective neural networks tend to have very low expected calibration error in general 3 the results in table 1 show consistent improvement in performance on existing robustness techniques when using introspection on top this motivates the usage of introspection as a general purpose technique to train more robust neural network models weakness 1 this is not an issue with the 
work itself but against the nomenclature of the approach the authors use a narrow definition of introspection pertinent specifically to discriminative learning why class x over class y in general i am against the practice of coadapting terms for ideas established in other fields cognitive science in this case and using them in an extremely narrow scope in machine learning this proposed method covers one abductive reasoning based posthoc introspection approach not all of introspection as it pertains to humans 2 could the low calibration error of the introspective model be affected by the fact that the introspective model is an mlp whereas the models used for comparison are cnns 1 showed that vectormatrix scaling of cnn logits leads to better calibration error and i wonder if the introspective model ends up doing something similar to reduce the calibration error references 1 chuan guo geoff pleiss yu sun and kilian q weinberger on calibration of modern neural networks in international conference on machine learning pmlr 2017 pp 13211330 disclaimer i have not formally verified the proofs provided by the authors in appendix the authors have adequately discussed the limitations of their work as well as its broader societal impact in section 7 docsepthis paper proposes a novel idea by using a simple threelayer mlp instead of fully connected layers as the reflection stage to ask the network to reflect on its feedforward decision by considering and evaluating all available choicesclasses the proposed introspection network is very inspiring in transforming n classes into n possible introspective questions and answers however the robustness provided by the introspective networks in section 4 is not clearly presented na docsepthe paper advocates the use of a two stage process for classification to increase robustness of nn the first net is standard the second network leverages gradients of the first network as inputs ie they compute gradients for all possible outcomes towards the last layer at least this is what seems to be the case according to fig 3 the basic idea is not novel against the authors claim see 1 this work published in 2020 uses also a 2stage process one fast and one slow stage it speaks of reflection uses the kahnemann reference etc however aside from these similarities the technical details seem to be quite different also the cited work does not rely on explanations of all classes however the authors should discuss this work theory is appreciated but i have doubt on correctness in the proof supplementary material from equation 15 to 16 the authors transform using loga0b0a1b1 loga0a1 logb0b1 but this equation is not true the paper does not improve in general ie for regular cifar10 but only for cifar10c thus it seems valuable only in a few cases eg the work above improves on cifar10 on the test set using reflection the evaluation is just on one dataset and one architecture this is insufficient and makes the generalization very questionable detailed comments robsutness robustness 1 reflectivenet learning from explanations from 2020 httpsarxivorgabs201113986 limitations are ok ### Summary:
the paper proposes an approach for reflecting on model predictions in a classification task the approach is novel and empirical evaluations show significant improvement over standard predictive networks one of the reviewers is critical of the paper because this is not the first twostage approach proposed i do not think that this is fair criticism given that in its technical details the current paper is significantly different from the existing approach the reviewer points out as the reviewer themselves agrees questions about the practical applicability when the number of classes is large and about the generalization of the conclusions to other datasets were raised which are fair points in my opinion and addressing these points would make a stronger contribution
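To make the two-stage scheme discussed in the review above concrete, the gradient-feature construction can be sketched as follows. This is an illustrative reconstruction based only on the review text, not the authors' released code; the attribute name `fc` for the final linear layer, the single-sample loop, and the choice of taking gradients with respect to that layer's weights are assumptions.

```python
import torch
import torch.nn.functional as F

def introspective_features(backbone, x, num_classes):
    # x: one input of shape (1, C, H, W); backbone.fc is assumed to be the
    # final linear layer acting on the penultimate representation.
    feats = []
    for c in range(num_classes):
        logits = backbone(x)                          # fast, feed-forward stage
        target = torch.tensor([c], device=x.device)   # "what if the label were c?"
        loss = F.cross_entropy(logits, target)
        grad_w, = torch.autograd.grad(loss, backbone.fc.weight)
        feats.append(grad_w.flatten())                # one gradient block per candidate class
    return torch.cat(feats)                           # input to the introspection MLP
```

The concatenated vector has dimension num_classes × (num_classes × penultimate width), which grows quickly with the number of classes — the scaling and memory concern raised by the first reviewer.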
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 23970, 247, 4460, 374, 13311, 9162, 2990, 835, 253, 337, 296, 3924, 8414, 273, 247, 2622, 3997, 10495, 1509, 949, 247, 2990, 285, 253, 374, 2109, 1509, 565, 2921, 13886, 10513, 247, 873, 273, 3386, 970, 27935, 273, 253, 4331, 503, 2542, 3828, 273, 3997, 10495, 2990, 285, 11999, 352, 949, 1529, 13361, 81, 281, 3283, 253, 3451, 966, 253, 873, 273, 3386, 323, 1016, 3410, 310, 3562, 407, 32147, 839, 253, 966, 44321, 2831, 290, 10144, 27935, 387, 253, 4331, 503, 2542, 3828, 323, 1016, 966, 275, 253, 10895, 275, 643, 3000, 597, 10173, 2831, 290, 10144, 27935, 323, 1016, 3410, 407, 9286, 1242, 4758, 1046, 2303, 966, 347, 253, 3410, 5203, 841, 27935, 403, 840, 32147, 456, 285, 4817, 949, 253, 13361, 81, 281, 6635, 253, 966, 5203, 597, 921, 326, 436, 2746, 4245, 731, 256, 5503, 323, 260, 338, 274, 740, 68, 285, 260, 338, 274, 740, 16498, 15302, 20544, 337, 253, 2929, 29328, 247, 4460, 2746, 281, 9004, 1419, 13650, 323, 247, 9162, 2990, 374, 597, 921, 28055, 326, 253, 3386, 5118, 970, 27935, 812, 320, 4217, 495, 597, 921, 1543, 327, 260, 338, 274, 740, 68, 285, 260, 338, 274, 740, 16498, 15302, 50276, 20881, 1255, 265, 337, 436, 30437, 273, 436, 789, 310, 4390, 3710, 281, 9162, 8892, 326, 452, 1077, 1643, 2303, 5971, 285, 247, 4331, 503, 2542, 3828, 273, 16453, 7877, 374, 1543, 403, 2011, 760, 323, 260, 338, 274, 740, 2905, 10895, 285, 16540, 7350, 604, 597, 789, 323, 643, 15302, 751, 12405, 12197, 1179, 17556, 8437, 12966, 1518, 3966, 495, 540, 2921, 13886, 3924, 310, 3541, 285, 11897, 47986, 253, 4477, 943, 1089, 4088, 281, 2953, 436, 285, 643, 13642, 3374, 50275, 9820, 5474, 339, 431, 248, 4477, 12661, 247, 2500, 493, 486, 15895, 323, 11454, 2990, 3061, 2403, 11797, 407, 1966, 540, 2921, 13886, 253, 806, 3924, 310, 253, 3963, 20741, 800, 3997, 10495, 3924, 835, 247, 3676, 2990, 4870, 271, 3280, 285, 15693, 253, 2412, 262, 3386, 253, 540, 46650, 3386, 323, 966, 891, 403, 840, 1677, 347, 253, 1818, 275, 253, 2990, 3602, 598, 1919, 253, 1390, 3828, 672, 247, 5203, 273, 891, 310, 5611, 323, 247, 3410, 1269, 253, 540, 46650, 3386, 403, 840, 908, 281, 6194, 247, 1273, 540, 46650, 2990, 534, 14177, 281, 3761, 253, 966, 13650, 2797, 432, 253, 540, 46650, 3386, 281, 253, 3216, 5083, 13650, 347, 973, 347, 253, 13650, 432, 253, 806, 2990, 253, 4477, 2085, 10527, 3538, 569, 323, 841, 3386, 616, 11193, 37139, 414, 347, 973, 347, 849, 281, 4908, 731, 14556, 253, 9470, 4679, 2684, 407, 253, 4477, 7568, 326, 540, 46650, 4715, 3157, 11454, 2990, 3045, 275, 2426, 273, 31640, 18543, 5028, 15644, 285, 3939, 4715, 20544, 50276, 18, 1223, 253, 2934, 273, 2412, 262, 27935, 9706, 966, 14259, 1491, 310, 417, 747, 970, 436, 1491, 281, 3157, 31640, 285, 18543, 275, 11454, 6928, 310, 3240, 4460, 285, 1077, 973, 17194, 28055, 285, 45190, 253, 7534, 28046, 8392, 281, 465, 18272, 358, 507, 3809, 31200, 4715, 281, 41509, 540, 46650, 4715, 310, 4623, 533, 8489, 3578, 3472, 50276, 19, 253, 4679, 5196, 327, 31640, 285, 18543, 921, 326, 540, 2921, 13886, 7729, 3157, 1097, 253, 562, 1171, 35360, 1071, 7200, 323, 32408, 3888, 347, 973, 347, 253, 18543, 273, 253, 11454, 2990, 253, 540, 46650, 11454, 6928, 5257, 281, 452, 1077, 1698, 3264, 18543, 2228, 275, 2087, 50274, 20, 253, 1543, 275, 2829, 337, 921, 5185, 7756, 275, 3045, 327, 5368, 31640, 5609, 672, 970, 540, 2921, 13886, 327, 1755, 436, 15265, 684, 253, 10393, 273, 540, 2921, 13886, 347, 247, 2087, 4096, 5853, 281, 6194, 625, 
10237, 11454, 2990, 3210, 50276, 20881, 1255, 50275, 18, 436, 310, 417, 271, 2523, 342, 253, 789, 3139, 533, 1411, 253, 295, 50125, 273, 253, 2746, 253, 4477, 897, 247, 6891, 5426, 273, 540, 2921, 13886, 21452, 5742, 281, 20741, 800, 4715, 50276, 22309, 966, 1269, 689, 966, 340, 275, 2087, 891, 717, 1411, 253, 3946, 273, 820, 26672, 272, 2426, 323, 5697, 4232, 275, 643, 4910, 9699, 5859, 275, 436, 1083, 285, 970, 731, 275, 271, 6685, 6891, 7990, 275, 5145, 4715, 436, 4081, 1332, 10949, 581, 490, 43324, 14720, 1754, 1501, 37806, 540, 2921, 13886, 2746, 417, 512, 273, 540, 2921, 13886, 347, 352, 45703, 281, 7497, 50274, 19, 812, 253, 1698, 18543, 2228, 273, 253, 540, 46650, 1566, 320, 5876, 407, 253, 958, 326, 253, 540, 46650, 1566, 310, 271, 13361, 81, 5727, 253, 3210, 908, 323, 5301, 403, 260, 79, 2224, 337, 2692, 326, 1670, 291, 526, 28491, 13642, 273, 260, 9866, 2412, 953, 5644, 281, 1805, 18543, 2228, 285, 891, 4282, 604, 253, 540, 46650, 1566, 7637, 598, 2509, 1633, 2074, 281, 4796, 253, 18543, 2228, 50276, 250, 3065, 50276, 18, 448, 9041, 1149, 80, 3471, 2727, 2318, 739, 340, 86, 5101, 285, 11895, 757, 2805, 359, 249, 24423, 327, 18543, 273, 4980, 11454, 6928, 275, 5213, 8059, 327, 5145, 4715, 268, 1686, 83, 4240, 7266, 13718, 14322, 1229, 50276, 3431, 34837, 50276, 74, 452, 417, 19186, 16058, 253, 27947, 2530, 407, 253, 4477, 275, 30762, 253, 4477, 452, 18212, 5469, 253, 7364, 273, 616, 789, 347, 973, 347, 697, 16055, 38058, 3486, 275, 2593, 818, 5474, 33032, 2520, 2929, 29328, 247, 4460, 2934, 407, 970, 247, 2969, 289, 250, 293, 4071, 13361, 81, 3185, 273, 4751, 4802, 8090, 347, 253, 12906, 3924, 281, 1642, 253, 2990, 281, 4887, 327, 697, 3997, 10495, 3061, 407, 7296, 285, 16344, 512, 2130, 10165, 19770, 253, 4081, 540, 2921, 13886, 2990, 310, 1077, 29853, 275, 27197, 295, 5971, 715, 295, 1896, 540, 46650, 3533, 285, 9172, 2299, 253, 31640, 2530, 407, 253, 540, 46650, 6928, 275, 2593, 577, 310, 417, 4518, 3559, 5549, 5474, 339, 431, 248, 2929, 23318, 253, 897, 273, 247, 767, 3924, 1232, 323, 9162, 281, 2572, 31640, 273, 48257, 253, 806, 2036, 310, 2629, 253, 1273, 2990, 19732, 1131, 27935, 273, 253, 806, 2990, 347, 14800, 26332, 597, 11897, 27935, 323, 512, 1896, 6973, 4404, 253, 1390, 3828, 387, 1878, 436, 310, 752, 3133, 281, 320, 253, 1083, 2556, 281, 3036, 495, 50275, 783, 5044, 2934, 310, 417, 4460, 1411, 253, 4477, 1750, 923, 337, 436, 789, 3863, 275, 9169, 4648, 671, 247, 374, 13311, 1232, 581, 3809, 285, 581, 3468, 3924, 352, 16544, 273, 12906, 4648, 253, 465, 18272, 39480, 3806, 3966, 2299, 9255, 432, 841, 22620, 253, 7681, 4278, 1646, 281, 320, 3240, 1027, 671, 253, 11106, 789, 1057, 417, 10725, 327, 22909, 273, 512, 5971, 2299, 253, 4477, 943, 2319, 436, 789, 50276, 32525, 310, 14109, 533, 891, 452, 5545, 327, 36594, 275, 253, 4737, 24864, 2144, 432, 5150, 1458, 281, 1668, 253, 4477, 4979, 970, 2412, 66, 17, 67, 17, 66, 18, 67, 18, 2412, 66, 17, 66, 18, 2412, 67, 17, 67, 18, 533, 436, 5150, 310, 417, 2032, 50276, 783, 2929, 1057, 417, 3157, 275, 2087, 26332, 323, 3963, 260, 338, 274, 740, 533, 760, 323, 260, 338, 274, 740, 68, 3021, 352, 3133, 9865, 760, 275, 247, 1643, 2219, 24088, 253, 789, 1840, 19132, 327, 260, 338, 274, 740, 327, 253, 1071, 873, 970, 12906, 50276, 783, 7103, 310, 816, 327, 581, 10895, 285, 581, 10336, 436, 310, 12497, 285, 2789, 253, 26647, 1077, 30455, 50276, 5992, 7193, 5701, 50276, 287, 1768, 307, 1255, 50275, 18848, 461, 1255, 337, 4887, 3870, 292, 4715, 432, 22909, 432, 9169, 5987, 39962, 2061, 5375, 1252, 883, 1867, 2691, 7364, 403, 8718, 2490, 
187, 4118, 18435, 27, 783, 2929, 29328, 271, 2746, 323, 18964, 327, 1566, 13650, 275, 247, 9162, 4836, 253, 2746, 310, 4460, 285, 16774, 27163, 921, 1534, 7756, 689, 2629, 15970, 6928, 581, 273, 253, 30628, 310, 4619, 273, 253, 2929, 984, 436, 310, 417, 253, 806, 2500, 493, 486, 2746, 4081, 891, 513, 417, 1158, 326, 436, 310, 4344, 14226, 1677, 253, 7681, 4278, 273, 253, 1655, 2929, 285, 271, 5368, 2746, 253, 37317, 2792, 562, 310, 3012, 1027, 347, 253, 37317, 3746, 5194, 3533, 670, 253, 8542, 414, 273, 30437, 672, 253, 1180, 273, 5971, 403, 1781, 285, 26647, 273, 253, 11815, 281, 643, 15302, 497, 5439, 534, 403, 4344, 2792, 275, 619, 4743, 285, 15974, 841, 2792, 651, 1056, 247, 10046, 7680, 50276 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 23970, 247, 4460, 374, 13311, 9162, 2990, 835, 253, 337, 296, 3924, 8414, 273, 247, 2622, 3997, 10495, 1509, 949, 247, 2990, 285, 253, 374, 2109, 1509, 565, 2921, 13886, 10513, 247, 873, 273, 3386, 970, 27935, 273, 253, 4331, 503, 2542, 3828, 273, 3997, 10495, 2990, 285, 11999, 352, 949, 1529, 13361, 81, 281, 3283, 253, 3451, 966, 253, 873, 273, 3386, 323, 1016, 3410, 310, 3562, 407, 32147, 839, 253, 966, 44321, 2831, 290, 10144, 27935, 387, 253, 4331, 503, 2542, 3828, 323, 1016, 966, 275, 253, 10895, 275, 643, 3000, 597, 10173, 2831, 290, 10144, 27935, 323, 1016, 3410, 407, 9286, 1242, 4758, 1046, 2303, 966, 347, 253, 3410, 5203, 841, 27935, 403, 840, 32147, 456, 285, 4817, 949, 253, 13361, 81, 281, 6635, 253, 966, 5203, 597, 921, 326, 436, 2746, 4245, 731, 256, 5503, 323, 260, 338, 274, 740, 68, 285, 260, 338, 274, 740, 16498, 15302, 20544, 337, 253, 2929, 29328, 247, 4460, 2746, 281, 9004, 1419, 13650, 323, 247, 9162, 2990, 374, 597, 921, 28055, 326, 253, 3386, 5118, 970, 27935, 812, 320, 4217, 495, 597, 921, 1543, 327, 260, 338, 274, 740, 68, 285, 260, 338, 274, 740, 16498, 15302, 50276, 20881, 1255, 265, 337, 436, 30437, 273, 436, 789, 310, 4390, 3710, 281, 9162, 8892, 326, 452, 1077, 1643, 2303, 5971, 285, 247, 4331, 503, 2542, 3828, 273, 16453, 7877, 374, 1543, 403, 2011, 760, 323, 260, 338, 274, 740, 2905, 10895, 285, 16540, 7350, 604, 597, 789, 323, 643, 15302, 751, 12405, 12197, 1179, 17556, 8437, 12966, 1518, 3966, 495, 540, 2921, 13886, 3924, 310, 3541, 285, 11897, 47986, 253, 4477, 943, 1089, 4088, 281, 2953, 436, 285, 643, 13642, 3374, 50275, 9820, 5474, 339, 431, 248, 4477, 12661, 247, 2500, 493, 486, 15895, 323, 11454, 2990, 3061, 2403, 11797, 407, 1966, 540, 2921, 13886, 253, 806, 3924, 310, 253, 3963, 20741, 800, 3997, 10495, 3924, 835, 247, 3676, 2990, 4870, 271, 3280, 285, 15693, 253, 2412, 262, 3386, 253, 540, 46650, 3386, 323, 966, 891, 403, 840, 1677, 347, 253, 1818, 275, 253, 2990, 3602, 598, 1919, 253, 1390, 3828, 672, 247, 5203, 273, 891, 310, 5611, 323, 247, 3410, 1269, 253, 540, 46650, 3386, 403, 840, 908, 281, 6194, 247, 1273, 540, 46650, 2990, 534, 14177, 281, 3761, 253, 966, 13650, 2797, 432, 253, 540, 46650, 3386, 281, 253, 3216, 5083, 13650, 347, 973, 347, 253, 13650, 432, 253, 806, 2990, 253, 4477, 2085, 10527, 3538, 569, 323, 841, 3386, 616, 11193, 37139, 414, 347, 973, 347, 849, 281, 4908, 731, 14556, 253, 9470, 4679, 2684, 407, 253, 4477, 7568, 326, 540, 46650, 4715, 3157, 11454, 2990, 3045, 275, 2426, 273, 31640, 18543, 5028, 15644, 285, 3939, 4715, 20544, 50276, 18, 1223, 253, 2934, 273, 2412, 262, 27935, 9706, 966, 14259, 1491, 310, 417, 747, 970, 436, 1491, 281, 3157, 31640, 285, 18543, 275, 11454, 6928, 310, 3240, 4460, 285, 1077, 973, 17194, 28055, 285, 45190, 253, 7534, 28046, 8392, 281, 465, 18272, 358, 507, 3809, 31200, 4715, 281, 41509, 540, 46650, 4715, 310, 4623, 533, 8489, 3578, 3472, 50276, 19, 253, 4679, 5196, 327, 31640, 285, 18543, 921, 326, 540, 2921, 13886, 7729, 3157, 1097, 253, 562, 1171, 35360, 1071, 7200, 323, 32408, 3888, 347, 973, 347, 253, 18543, 273, 253, 11454, 2990, 253, 540, 46650, 11454, 6928, 5257, 281, 452, 1077, 1698, 3264, 18543, 2228, 275, 2087, 50274, 20, 253, 1543, 275, 2829, 337, 921, 5185, 7756, 275, 3045, 327, 5368, 31640, 5609, 672, 970, 540, 2921, 13886, 327, 1755, 436, 15265, 684, 253, 10393, 273, 540, 2921, 13886, 347, 247, 2087, 4096, 5853, 281, 6194, 625, 
10237, 11454, 2990, 3210, 50276, 20881, 1255, 50275, 18, 436, 310, 417, 271, 2523, 342, 253, 789, 3139, 533, 1411, 253, 295, 50125, 273, 253, 2746, 253, 4477, 897, 247, 6891, 5426, 273, 540, 2921, 13886, 21452, 5742, 281, 20741, 800, 4715, 50276, 22309, 966, 1269, 689, 966, 340, 275, 2087, 891, 717, 1411, 253, 3946, 273, 820, 26672, 272, 2426, 323, 5697, 4232, 275, 643, 4910, 9699, 5859, 275, 436, 1083, 285, 970, 731, 275, 271, 6685, 6891, 7990, 275, 5145, 4715, 436, 4081, 1332, 10949, 581, 490, 43324, 14720, 1754, 1501, 37806, 540, 2921, 13886, 2746, 417, 512, 273, 540, 2921, 13886, 347, 352, 45703, 281, 7497, 50274, 19, 812, 253, 1698, 18543, 2228, 273, 253, 540, 46650, 1566, 320, 5876, 407, 253, 958, 326, 253, 540, 46650, 1566, 310, 271, 13361, 81, 5727, 253, 3210, 908, 323, 5301, 403, 260, 79, 2224, 337, 2692, 326, 1670, 291, 526, 28491, 13642, 273, 260, 9866, 2412, 953, 5644, 281, 1805, 18543, 2228, 285, 891, 4282, 604, 253, 540, 46650, 1566, 7637, 598, 2509, 1633, 2074, 281, 4796, 253, 18543, 2228, 50276, 250, 3065, 50276, 18, 448, 9041, 1149, 80, 3471, 2727, 2318, 739, 340, 86, 5101, 285, 11895, 757, 2805, 359, 249, 24423, 327, 18543, 273, 4980, 11454, 6928, 275, 5213, 8059, 327, 5145, 4715, 268, 1686, 83, 4240, 7266, 13718, 14322, 1229, 50276, 3431, 34837, 50276, 74, 452, 417, 19186, 16058, 253, 27947, 2530, 407, 253, 4477, 275, 30762, 253, 4477, 452, 18212, 5469, 253, 7364, 273, 616, 789, 347, 973, 347, 697, 16055, 38058, 3486, 275, 2593, 818, 5474, 33032, 2520, 2929, 29328, 247, 4460, 2934, 407, 970, 247, 2969, 289, 250, 293, 4071, 13361, 81, 3185, 273, 4751, 4802, 8090, 347, 253, 12906, 3924, 281, 1642, 253, 2990, 281, 4887, 327, 697, 3997, 10495, 3061, 407, 7296, 285, 16344, 512, 2130, 10165, 19770, 253, 4081, 540, 2921, 13886, 2990, 310, 1077, 29853, 275, 27197, 295, 5971, 715, 295, 1896, 540, 46650, 3533, 285, 9172, 2299, 253, 31640, 2530, 407, 253, 540, 46650, 6928, 275, 2593, 577, 310, 417, 4518, 3559, 5549, 5474, 339, 431, 248, 2929, 23318, 253, 897, 273, 247, 767, 3924, 1232, 323, 9162, 281, 2572, 31640, 273, 48257, 253, 806, 2036, 310, 2629, 253, 1273, 2990, 19732, 1131, 27935, 273, 253, 806, 2990, 347, 14800, 26332, 597, 11897, 27935, 323, 512, 1896, 6973, 4404, 253, 1390, 3828, 387, 1878, 436, 310, 752, 3133, 281, 320, 253, 1083, 2556, 281, 3036, 495, 50275, 783, 5044, 2934, 310, 417, 4460, 1411, 253, 4477, 1750, 923, 337, 436, 789, 3863, 275, 9169, 4648, 671, 247, 374, 13311, 1232, 581, 3809, 285, 581, 3468, 3924, 352, 16544, 273, 12906, 4648, 253, 465, 18272, 39480, 3806, 3966, 2299, 9255, 432, 841, 22620, 253, 7681, 4278, 1646, 281, 320, 3240, 1027, 671, 253, 11106, 789, 1057, 417, 10725, 327, 22909, 273, 512, 5971, 2299, 253, 4477, 943, 2319, 436, 789, 50276, 32525, 310, 14109, 533, 891, 452, 5545, 327, 36594, 275, 253, 4737, 24864, 2144, 432, 5150, 1458, 281, 1668, 253, 4477, 4979, 970, 2412, 66, 17, 67, 17, 66, 18, 67, 18, 2412, 66, 17, 66, 18, 2412, 67, 17, 67, 18, 533, 436, 5150, 310, 417, 2032, 50276, 783, 2929, 1057, 417, 3157, 275, 2087, 26332, 323, 3963, 260, 338, 274, 740, 533, 760, 323, 260, 338, 274, 740, 68, 3021, 352, 3133, 9865, 760, 275, 247, 1643, 2219, 24088, 253, 789, 1840, 19132, 327, 260, 338, 274, 740, 327, 253, 1071, 873, 970, 12906, 50276, 783, 7103, 310, 816, 327, 581, 10895, 285, 581, 10336, 436, 310, 12497, 285, 2789, 253, 26647, 1077, 30455, 50276, 5992, 7193, 5701, 50276, 287, 1768, 307, 1255, 50275, 18848, 461, 1255, 337, 4887, 3870, 292, 4715, 432, 22909, 432, 9169, 5987, 39962, 2061, 5375, 1252, 883, 1867, 2691, 7364, 403, 8718, 2490, 
187, 4118, 18435, 27, 783, 2929, 29328, 271, 2746, 323, 18964, 327, 1566, 13650, 275, 247, 9162, 4836, 253, 2746, 310, 4460, 285, 16774, 27163, 921, 1534, 7756, 689, 2629, 15970, 6928, 581, 273, 253, 30628, 310, 4619, 273, 253, 2929, 984, 436, 310, 417, 253, 806, 2500, 493, 486, 2746, 4081, 891, 513, 417, 1158, 326, 436, 310, 4344, 14226, 1677, 253, 7681, 4278, 273, 253, 1655, 2929, 285, 271, 5368, 2746, 253, 37317, 2792, 562, 310, 3012, 1027, 347, 253, 37317, 3746, 5194, 3533, 670, 253, 8542, 414, 273, 30437, 672, 253, 1180, 273, 5971, 403, 1781, 285, 26647, 273, 253, 11815, 281, 643, 15302, 497, 5439, 534, 403, 4344, 2792, 275, 619, 4743, 285, 15974, 841, 2792, 651, 1056, 247, 10046, 7680, 50276 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this work highlights the existence of imbalanced gradients as a phenomenon that may hinder optimization of gradientbased adversarial attacks and thus give a false sense of robustness imbalanced gradients may occur as the attack objective consists of the difference of two terms typically the outputs of the network on two different classes when the gradients of these two terms have opposite directions the attack optimization may get easily stuck in a suboptimal local optimum thus decreasing the attack effectiveness to overcome this issue the authors propose a simple variant that can also be applied to existing attacks based on following the gradient of each term in the objective iteratively rather than considering their combination the paper addresses an important issue in adversarial machine learning ie the security evaluation of a defense in fact the performance of many proposed defenses has been overestimated due to the weakness of attacks used in the evaluation one of the causes of this is the inability of gradientbased attacks to correctly perform the optimization the paper provides an exhaustive analysis and formulation of the problem of imbalanced gradients for which also a heuristic metric is provided this phenomenon is then evaluated on a set of stateoftheart defenses and attacks the authors propose an attack variant inspired from pgd and mt attacks using the difference of class logits as the objective ie the socalled margin loss then the optimization algorithm works as follows for each restart in the first half of the optimization process only one of the two terms of the margin loss is alternatively used while in the second half the whole margin loss is used the attack formulation and algorithm are clear it is important to remark that rather than proposing a new attack framework the authors actually only propose an optimization variant of the existing pgd and mt attacks and run these attacks with the proposed changes as correctly stated by the authors indeed their proposal can be viewed as an initialization strategy for adversarial attacks rather than a completely novel attack experiments are reproducible and consider many stateoftheart attack algorithms and defenses however the comparison with other attacks does not seem totally fair the same hyperparameters step size number of iterations and restarts are used for pgd cw and md attacks same also for odi mt and mdmt and default hyperparameters are used for the autoattack framework this is problematic as each attack should be tuned independently and the choice of hyperparameters should be justified a common criterion for hyperparameter tuning may be selected and a gridsearch over the attack hyperparameters should be conducted as shown in many other papers the attack hyperparameters play a key role in correctly optimizing the attack objective even single data points can require different hyperparameter tuning another limitation of this work is that it is restricted solely to wideresnet architectures and cifar10 furthermore the computational efficiency of the attack is not discussed although it is not a primary aspect it should at least be mentioned when it is compared to other attacks since it is potentially slower as it might require more iterations and restarts to be fair attacks should be even compared under the same complexity eg number of queries to the target model in the last column of table 1 the performance 
difference between mdmt and pgd is reported however this could lead to an overestimation of the attack performance for a fair evaluation the mdmt attack should be compared to the stronger attack available at the state of the art according to the experimental evaluation conducted by the authors and not to pgd which has been shown to perform worse than more recent attacks in many papers eg autoattack docsepthis paper identified the issue of imbalanced gradient verified through some recent defense methods motivated by such an issue a marginal decomposition md attack is proposed to offer a stronger robustness measure in general the paper is well written and the studied problem is interesting the md perspective explains why label smoothing may provide insufficient robustness my comments are listed below 1 in sec 32 why are the first k2 iterations used to maximize the individual margin term and then update the entire loss what will happen if the scheduling is the opposite updating the entire loss at the first k2 iterations then individual termssome ablation studies or explanation should be provided 2 does the proposed stronger attack offer a stronger minmax defense suppose that the ordinary pgd attack is replaced by an md attack during minmax training will it offer better overall robustness the general question to ask is in addition to root cause analysis on the ineffectiveness of some existing defense methods what are the additional benefits of the newly proposed md attack 3 does it seem that the md attack has to run over more iterations than the pgd attack leading to extensive computation cost 4 what is the possible effect of the md attack on the generated perturbation pattern in the blackbox setting will the md attack be more queryefficient than a commonlyused pgd blackbox attack postrebuttal i am mostly satisfied with the authors response after reading other reviewers comments i shared a similar concern on the marginal contribution however the newly added blackbox result is a good addition to the paper thus i keep my original rating toward the positive side docsepthe paper introduces and analyses the possibility that the effectiveness of pgdbased adversarial attacks might be reduced by imbalanced gradients between the terms of the margin losses commonly used as a remedy it also proposed a new scheme for pgd attack where for the first half of the iterations a singleterm loss is optimized before falling back on the the usual margin loss the authors test the hypothesis of imbalanced gradients introducing a new metric gir and the newly proposed attack in two versions md and mdmt on several defenses based on adversarial training pros 1 understanding why some adversarial defenses make some versions of pgd eg with crossentropy or margin loss overestimate sometimes even largely robustness is an interesting direction and the proposed explanation of imbalanced gradients is as far as i know novel 2 the proposed attacks are effective against many defenses cons 1 it is not clear what the proposed metric gradient imbalance ratio gir captures and how it is connected the scheme of md in fact the gir values for jarnat1 sense and mma which are not robust especially against md are smaller than for mart and similar to those for uat which are instead robust and little or not affected by gradient imbalance according to the results of the md attacks moreover feascatter and mart have similar gir but the improvement of mdmt compared to standard pgd is 31 for the former but only around 5 for the latter 2 in the 
comparison with other methods table 1 the results of aa seem worse than reported in the latest version of croce hein 2020 which reports lower values of robust accuracy than mdmt for the same defenses also when considering only individual attacks it would make sense to include fab as in sec 4.3 and appendix c the authors state that it is the most effective in aa and partially avoids the problem of imbalanced gradients 3 in the md attack the first half of the iterations has a step size set to $2\cdot\epsilon$ which is then decreased in the second stage this seems similar to what autopgd does overall the hypothesis that imbalanced gradients might be a cause of overestimation of robustness is reasonable and worth exploration but the metric proposed by the authors does not clearly capture this and the attacks which should overcome that issue are not clearly better than existing methods update after rebuttal i thank the authors for the response after reading it the revised version and the other reviews the concerns expressed in the initial review are still valid then i keep the initial score docsepthis paper explores constructing adversarial examples in classification in order to create better robustness metrics for general classifiers an attack is defined as an $\epsilon$-perturbation of the learned parameters which creates a model whose performance is much degraded the premise of this paper is to use gradient imbalance as a way of creating perturbation targets which are claimed and shown numerically to better fool networks that are trained to withstand more traditional attacks and can be used to create more robust models in general i think the premise is wellmotivated and the results look promising however i am having quite a bit of trouble understanding the main method from the margin loss given $\ell(x,y) = z_{\max} - z$ which seems similar to the loss in logistic regression with gradient $\nabla \ell(x,y) = -y x^\top (1 - \sigma(y x^\top \theta))$ it is not entirely clear what $z_{\max}$ and $z$ correspond to in the general case i am not clear as to why we would expect label balance eg why would $1 - \sigma(y x^\top \theta)$ it is also not clear why when one term dominates the other that it is a form of nonrobustness alternatively the interpretation may be that a particular training sample is affecting the model more than the others that would seem like a more reasonable metric of fallibility but i dont think thats what the authors intend overall i do think the paper can be improved if written more clearly with terms and concepts defined i guessed that $\pi$ is projection and other terms like imbalance ratio are buried in text and hard to find but i find the method overall simple and elegant so with continued discussions with the authors to clarify what exactly is happening i would raise my score ### Summary:
the paper identifies a subtle gradient problem in adversarial robustness imbalanced gradients which can create a false sense of adversarial robustness the paper provides insights into this problem and proposes a margin decomposition based solution for the pgd attack pros: novel insights into why some adversarial defenses may make some versions of pgd overestimate robustness; proposes a method that is motivated by such findings of imbalanced gradients; the proposed attacks are shown effective across a wide range of defenses cons: the proposed gradient imbalance ratio could be better motivated ie how is it connected to the scheme of margin decomposition; limited novelty in the attacks ie a variant of the existing pgd and mt attacks with some proposed changes; various concerns with the experiments ie stepsize tuning and the choice of hyperparameters overall the reviewers felt that there were some interesting ideas and directions presented in the paper however the reviewers also felt that the contribution was of marginal significance and that more confidence in the various components ie how the proposed metric measures the imbalanced gradient effect together with addressing the various concerns in the experiments would have made the paper more convincing
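To clarify the margin-decomposition schedule discussed above, a minimal PGD-style sketch is given below. It is a reconstruction from the reviews' description (the first half of the iterations follows a single term of the margin loss, the second half the full margin $z_{\max} - z_y$), not the authors' implementation; the choice of starting term, the fixed step size, and the single-restart structure are assumptions — per the reviews, the paper alternates terms across restarts and uses a larger exploration step size in the first stage.

```python
import torch
import torch.nn.functional as F

def md_attack_sketch(model, x, y, eps, alpha, steps):
    delta = torch.zeros_like(x)
    for t in range(steps):
        delta.requires_grad_(True)
        logits = model(x + delta)
        mask = F.one_hot(y, logits.size(1)).bool()
        z_y = logits[mask]                                             # true-class logit
        z_max = logits.masked_fill(mask, float("-inf")).max(1).values  # best other logit
        # stage 1: follow a single term of the margin loss (here -z_y);
        # stage 2: follow the full margin loss z_max - z_y
        loss = (-z_y).sum() if t < steps // 2 else (z_max - z_y).sum()
        grad, = torch.autograd.grad(loss, delta)
        delta = (delta.detach() + alpha * grad.sign()).clamp(-eps, eps)
        delta = (x + delta).clamp(0, 1) - x                            # keep image valid
    return (x + delta).detach()
```

In the full algorithm described by the reviews, this schedule is repeated over several random restarts, alternating which margin term drives the first stage, which is why one reviewer characterizes the proposal as an initialization strategy for existing attacks rather than an entirely new attack.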
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

A method for learning physics priors is proposed for faster learning and better transfer learning. The key idea is to learn physics priors using SpatialNet, which is similar to a convolutional LSTM model, to make predictions. The authors propose to improve the sample efficiency of deep RL algorithms by augmenting PPO's state input with 3 future frames predicted by the physics-prior model. The authors show that using SpatialNet leads to better prediction of the future compared to previous methods on a simple simulated physics environment, and that it can be incorporated to improve performance on Atari games.

(a) I am a bit unclear on how SpatialNet is trained along with the policy in the IPA architecture. In Section 5.1 it is mentioned that "we train both SpatialNet and the policy simultaneously and use proximal policy optimization (PPO) as our model-free algorithm"; however, earlier in Section 3 it is mentioned that the agent is first pretrained with prediction and then the pretrained model is used with the RL algorithm. Can the authors clarify the training procedure? Is it the case that SpatialNet is first pretrained with some data and then finetuned along with the environment rewards? Do the policy net and the frame-prediction net share any parameters?

(b) Is the comparison in Table 2 / Figure 5 fair in terms of the number of frames seen by the agent? If a PPO agent sees n frames, how many frames does the IPA agent see, counting both the training of SpatialNet and of the policy?

(c) How about baselines where, instead of augmenting PPO with any additional frames, the policy is initialized with weights learned by SpatialNet? Another baseline is to jointly optimize for future frame prediction and environment reward; in this case at least some parameters would be shared between SpatialNet and the policy net, but without augmenting the input state with future predicted frames.

The SpatialNet architecture is similar to a convolutional LSTM, and I therefore do not think it is a significantly novel technical contribution. The application of SpatialNet to augment frames in the state is, to my best knowledge, novel. Answers to the above questions will help me understand the experiments better; right now the method is slightly unclear to me, and the results on Atari (Figure 11) are a bit underwhelming. Also, why did the authors choose the specific Atari games they reported results on, and why not other games too?

Summary: this paper proposes to learn a dynamics model via future prediction in video and to use it for reinforcement learning. The dynamics model is a variant of a convolutional LSTM and is trained with mean squared error on the future frame. The way the dynamics model is used for reinforcement learning is similar to Weber et al. (2017), where the k-step prediction of the dynamics model is used as an augmented input to the policy.

Strength: training a dynamics model to understand physics and using it for reinforcement learning is an interesting problem that is worth exploring; this paper tackles the problem and demonstrates an experimental setting based on physics games.

Weakness: the part for understanding the dynamics model is very close to the existing convolutional LSTM model (Xingjian et al., 2015), which is a popular baseline in the video modelling community, and the way the pretrained dynamics model is used for reinforcement learning is similar to Weber et al. (2017), yet the paper does not provide a comparison to either of these two baselines. Since the differences from these existing methods are subtle, a clear comparison with them and with their distinguishing characteristics is essential to show the novelty of the paper.

Overall comment: this paper addresses the interesting problem of understanding dynamics for solving reinforcement learning, but the suggested method is not novel and comparisons with existing close methods are not performed.

Quality: the paper proposes a new method to learn a physics prior in an environment, along with a new SpatialNet architecture. Instead of learning a specific dynamics model, the authors propose to learn a dynamics model that is action-free, purely capturing the extrinsic dynamics, and they formulate this as a video prediction problem. A series of experiments is conducted on PhysWorld, a new physics-based simulator, and on a subset of Atari games.

Clarity: the writing is good.

Originality: this work is original, as most model-based RL works focus on learning a single environment rather than common rules of physics.

Significance: the work proposes an interesting direction to pursue.

Cons:
1. In Figure 4 the authors show that a pretrained model can learn faster than random initialization; however, it is hard to ablate the factor that causes this effect. Does the dynamics predictor learn physics priors, or is it just because it learns a visual prior over the shapes of the objects, etc.?
2. The baselines for Atari games are quite limited. First of all, 3 out of 5 Atari games in the original PPO paper show that ACER performs better than PPO (Asteroid, Breakout, DemonAttack); I think it is better to demonstrate improvement upon state-of-the-art methods.
3. All the experiments are shown with only 3 random seeds and without error bars in the main paper, although the reward plots are shown in Figure 11.
4. 5 out of 10 Atari games perform similarly to PPO according to Figure 11; it is hard to be conclusive when half of the experiments are positive and the rest are not.
5. There is a lack of discussion about ego-dynamics: there are physics priors for both the environment and the controller. Usually the controller/agent requires an action to predict its own dynamics, so why should we omit the ego-dynamics and only model the outer world?
6. Physics priors usually arise in physical environments. The proposed method works well in the PhysWorld environments, but is there a task more realistic than Atari games that could leverage the power of physics priors more? It is good that this method works on some Atari games, but isn't learning the dynamics of Atari games a bit off topic?
7. The transfer learning experiments should contain a meta-learning baseline (MAML/Reptile): since the paper is about learning a physics prior, it is fair to add meta-learning baselines for comparison.

I think the direction is interesting and the effort is well made, but the experiments are less convincing than the abstract and introduction.

### Summary:

The paper suggests a new way to learn a physics prior in an action-free way from raw frames; the idea is to learn the common rules of physics, in some sense, from purely visual observations and use that as pretraining. The authors added a number of experiments in response to the reviewer concerns, but the submission still fell short of their expectations. In the post-rebuttal discussion the reviewers mentioned that it is not clear how SpatialNet differs from a ConvLSTM, raised the writing quality, and noted that the physics prior is really quite close to what others call video prediction in other baselines.
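The mechanism discussed above, feeding a PPO policy an observation augmented with a few frames predicted by a pretrained dynamics model, can be illustrated with a short sketch. This is only an illustration of the idea, not the reviewed paper's implementation: the toy convolutional predictor stands in for the ConvLSTM-style SpatialNet, the function and class names are invented for the example, and the predictor is frozen here even though the pretrain-versus-joint-training question is exactly what reviewer (a) asks about. The choice of k = 3 mirrors the "3 future frames" mentioned in the review.

```python
import torch
import torch.nn as nn

class TinyDynamicsModel(nn.Module):
    """Stand-in for a pretrained frame-prediction network (a ConvLSTM-style
    model in the reviewed paper); a single conv layer keeps the sketch short."""
    def __init__(self, channels: int = 3):
        super().__init__()
        self.step = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        return self.step(frame)  # predicted next frame, same shape as the input

def augment_observation(obs: torch.Tensor, dynamics: nn.Module, k: int = 3) -> torch.Tensor:
    """Concatenate the current frame with k predicted future frames along the
    channel axis; the result is what the policy network would consume as state."""
    frames = [obs]
    pred = obs
    with torch.no_grad():  # predictor treated as fixed w.r.t. the RL loss in this sketch
        for _ in range(k):
            pred = dynamics(pred)
            frames.append(pred)
    return torch.cat(frames, dim=1)

dynamics = TinyDynamicsModel()
obs = torch.rand(2, 3, 84, 84)                     # (batch, channels, H, W)
policy_input = augment_observation(obs, dynamics)  # -> (2, 12, 84, 84)
print(policy_input.shape)
```

In a full agent, `policy_input` would replace the raw observation in the PPO update, and one would have to decide (as the reviewers ask) whether the predictor's parameters are frozen, finetuned jointly with the policy, or partially shared with it.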
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

The authors demonstrate that transformers obtain impressive performance even when some of the layers are randomly initialized and never updated. The authors run experiments on four types of reservoir: a transformer reservoir, an FFN (feedforward-layer) reservoir, a BiGRU reservoir, and a CNN reservoir, and the results show that the reservoir variants can achieve competitive or better performance than a normal transformer on machine translation, language modeling, and MLM pretraining. As pointed out by the authors, deep reservoir computing networks (Scardapane & Wang, 2017; Gallicchio & Micheli, 2017) have been explored before, so the main novelty of this paper is only the reservoir exploration of the transformer structure. Although the method is interesting and the experiments are well done, the work "Reducing Transformer Depth on Demand with Structured Dropout" seems more straightforward and effective, and the authors should include comparisons with this method. Also, since there are still residual connections, how about simply adding some noise to the previous layer? Overall, although the experiments are interesting, the paper does not provide novel theory and misses comparisons to previous works, so it is hard to evaluate the significance of this work.

Pros: 1. The reservoir idea applied to transformers is interesting and has not been explored before. 2. The authors validate the idea on many different tasks.

Cons: 1. The reservoir operation was already explored on other structures. 2. No strong baseline is provided, such as "Reducing Transformer Depth on Demand with Structured Dropout" or other structured-pruning-based methods. 3. No novel theory is offered to explain the method.

Update: I like the experiment added in the revision; however, it is only tested on IWSLT, which is a smaller dataset and can be influenced by many hyperparameters. It is not clear what frozen layers mean for LayerDrop, and this needs to be clarified with more details. I did not find a clear comparison with stronger baselines on WMT in the revision. As pointed out by the authors, I think the theory mainly comes from previous works. I have also read the other reviews; overall I would like to keep my rating.

The paper explores the concept of randomization in transformer architectures: essentially, some of the layers in the encoder network are replaced by fixed layers, which straightforwardly makes the computation faster in the prediction phase. The paper also illustrates a method called backskipping to reduce the cost of backpropagating gradients through the fixed layers. The proposal is well supported by a number of experiments in the area of machine translation.

Pros: I find the paper very clear, well written, and readable; I enjoyed reading it. The research proposed in the paper seems novel enough to me and goes towards a fruitful research direction, that of randomized neural algorithms, although this is nowadays established in the ML community. The experiments offer a nice perspective on the advantages of the proposed model.

Cons: I find the use of the term "reservoir" inappropriate in this work. In my understanding, a reservoir is essentially a fixed untrained recurrent layer, and the dynamical aspect is fundamental in this respect; in the paper, instead, "reservoir" stands mainly for "randomized". All in all, I do not see reservoir layers in the proposed model, and I suggest taking this into account, perhaps changing the name and the title. The experiments show that using untrained layers gives a positive tradeoff between accuracy and compute time; in the experimental comparison I think it would however be important to see how this tradeoff behaves and scales with respect to the number of trainable weights. When training is avoided in some of the layers, the number of trainable weights is reduced and the overall complexity of the system is reduced too, which has a sort of regularization effect. The experiments show that it is possible to avoid training in some of the layers of a transformer; however, in randomized neural networks a pivotal role is played by the scaling parameters of the involved weight matrices. Orthogonal matrices are used in the experiments, which is good, but I wonder how, and whether, the scaling of such matrices affects the performance; ideally such scaling should be a hyperparameter chosen on a validation set.

A few minor points: please, if possible, clarify more explicitly the concept of converging faster; please clarify the train/validation/test splitting for the used datasets; please introduce the datasets before Section 3.1, where they are already mentioned without an introduction.

Edit: I would like to thank the authors for the nice work during the review process. I am pretty satisfied with it, and I feel serene in increasing my rating to acceptance.

Summary: the paper studies how transformers train when some layers are kept fixed at their randomly initialized parameters. The authors observe that transformers can be effectively trained with a significant percentage of their layers frozen. It is also argued that, as a result, transformers can be trained more efficiently, since gradients with respect to the frozen layers' parameters need not be computed.

Strengths: the proposed idea is very simple and easy to understand and implement, and the authors perform thorough experiments on several datasets and tasks.

Weaknesses:
1. The AUCC metric does not measure efficiency independently of the time budget, as is claimed. For instance, given a high enough time budget, the best-performing model will always have the highest AUCC; conversely, given a low enough budget, the fastest-to-converge model will always be better. I agree with the authors that measuring the efficiency of a neural network is a hard problem, but I do not think that the AUCC metric is a good solution; "time to x% of best score" is probably much more informative. In addition, "time to best score" is also not very informative, as it depends heavily on the learning rate schedule and the random oscillations of training.
2. From the provided comparison with respect to the number of updateable layers, we cannot deduce information regarding efficiency. For instance, in Figure 2 for WMT, how many frozen layers are used for the reservoir transformers, and what is the wall-clock time per epoch for each model?
3. From the comparison in the supplementary material with respect to all the layers in the model, we observe that the performance is on par with standard transformers, but what is the improvement in efficiency, namely how much faster are the reservoir transformers for a given number of layers?

Reasons for recommendation: the paper proposes a very simple idea and then evaluates it experimentally; therefore the experimental section should be thorough and should lead to clear conclusions. The newly proposed metric, as well as the lack of comparison between training time and achieved performance, leads me to propose rejection.

Miscellaneous: Figure 14, bottom right, should probably read "validation ppl".

Post-rebuttal update: I would like to thank the authors for their additions to the paper. I believe that the extra metrics significantly improve the reader's ability to extract conclusions from the experiments. Having said that, I believe that the extra experiments and numbers do not paint a clearer picture. For instance, on IWSLT and WMT there is indeed fairly consistent evidence that the FFN reservoir is more efficient to train than fully trainable transformers; however, for enwik8 the T reservoir outperforms everything significantly when looking at Figure 15, but judging from Figure 13 we see that the best-case scenario has been selected for the T reservoir, namely 32 or 48 layers. Even more importantly, perhaps, the story is completely different when looking at the test set evaluation, where FFN reservoirs perform better and the T reservoir actually performs worst among all methods. Similarly, on RoBERTa pretraining, fully trainable transformers seem to achieve the lowest validation perplexity, and with the highest efficiency. To summarize, I believe that reservoir transformers could be a useful tool for improving either the efficiency or the generalization of transformers in low-data regimes, or both; however, a distillation of the experiments and conclusions is required for this to be shown in the paper. Namely, even with all those numbers, I still cannot judge which reservoir layer will be better, in what sense it will be better (accuracy or efficiency), and why it will be better. Due to the above I tend to keep my score, but because the authors' additions provide significantly more information and, in my opinion, value to the paper, I will increase my score to 5.

### Summary:

The reviewers were split between accept (7) and borderline reject (two 5s). All three reviewers acknowledged that the proposed approach is simple and intuitive, but the paper follows, for the most part, the concept of reservoir operation and applies it to transformers. The main criticisms were insufficient experiments (R5) and the lack of a clear conclusion (R2). I found these concerns to be valid and did not find strong reasons to overturn the recommendations. More comprehensive experiments, especially on WMT, and clear conclusions (accuracy or efficiency) would make this paper much stronger.
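The setup under discussion, training a transformer while some randomly initialized layers are never updated, can be sketched in a few lines of PyTorch. This is not the authors' code: the layer sizes, the choice of which layers are frozen, and the use of the stock `nn.TransformerEncoderLayer` are assumptions made for illustration, and the paper's FFN/BiGRU/CNN reservoir variants and its backskipping trick for cheaper backpropagation are not shown.

```python
import torch
import torch.nn as nn

def build_encoder_with_reservoirs(num_layers: int = 6,
                                  frozen_layers: tuple = (2, 4),
                                  d_model: int = 64) -> nn.ModuleList:
    """Stack of standard encoder layers in which the listed layers keep their
    random initialization: their parameters are excluded from training."""
    layers = nn.ModuleList([
        nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        for _ in range(num_layers)
    ])
    for idx in frozen_layers:
        for p in layers[idx].parameters():
            p.requires_grad = False  # "reservoir" layer: random and never updated
    return layers

layers = build_encoder_with_reservoirs()
x = torch.rand(8, 16, 64)            # (batch, sequence length, d_model)
for layer in layers:
    x = layer(x)                     # frozen layers still transform the activations

trainable = sum(p.numel() for l in layers for p in l.parameters() if p.requires_grad)
total = sum(p.numel() for l in layers for p in l.parameters())
print(f"trainable parameters: {trainable} / {total}")
```

With this kind of freezing, the optimizer is typically given only the parameters that still require gradients, so no parameter gradients are computed for the frozen layers; activations still flow through them (and through the residual connections), which is why one of the reviews asks how this differs from simply injecting noise or from LayerDrop-style depth reduction.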
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: docsepthis is a data analysis study on iclr submissions from 2017 to 2021 several variables such as gender location prior experience and topic of the papers are considered the study is mainly univariate and tries to find meaningful patterns for inferring representation of minorities at iclr positive aspects the observations are interesting especially for the organizers and senior members of iclr the methodology and analysis are clear and easy to follow negative aspects the paper is not wellmotivated and does not seem to fit within the general scope of iclr see more below the analyses are univariate and possibly misleading see more below the study is nonconclusive and lacks a coherent structure no indepth study of the observations is provided and the results raise more questions than answers main comments the main problem with the study is the focus on univariate analysis the phenomena in social sciences often involve more than one variable and a univariate analysis gives a biased and often misleading answer a notable example is the work by blau and kahn blau francine d and lawrence m kahn 2017 the gender wage gap extent trends and explanations journal of economic literature 55 3 789865 for drawing meaningful conclusions i encourage authors to perform a multivariate analysis on all the variables considered in this study and possibly more variables drawn from similar studies in social sciences the study is not wellmotivated and it is not clear 1 how it fits within the scope of iclr and 2 what conclusion should be drawn from the multiple correlations found in the study the analysis done in this paper is narrowed to iclr data only and does not relate to representation learning and broader applications the paper is not conclusive and is built as a set of seemingly independent correlationbased analyses without a coherent structure there is no deeper analysis of the observations for example when a certain bias exists assuming a correct analysis what are the underlying dynamics that lead to these biases considering the anonymity of the reviewing process in many cases where the authors found there is a gender bias in the accepted papers per topic the question arises of how a gender bias can appear in this kind of reviewing process isnt this just a byproduct of the small dataset and even smaller subset of female authors within the dataset i find the justification behind considering only the first and last authors for the mentorship analysis rather poor the papers submitted in cs venues are more often than not the product of more than one teams effort in which the senior heads of each team do not necessarily have a mentorship relationship with the first author of the paper often times a student i think authors should consider the affiliations experience and gender of all authors of a paper and draw conclusions on the mentorship also there can be more than one mentor eg two senior authors with the same affiliation as the first author the same goes for attributing a location to a paper last author is not a good representation necessarily a better approach is to make a new class of multinational papers in section 31 authors study data from 2020 and 2021 to infer the statistics for first authors who came back the next year the conference in these two years was virtual and quite possibly an anomaly in terms of the behavioral patterns authors should consider previous years as well i evaluate the
paper as a reject for the following reasons the paper is not wellmotivated and does not seem to fit within the general scope of iclr see more below the analyses are univariate and possibly misleading see more below the study is nonconclusive and lacks a coherent structure no indepth study of the observations is provided and the results raise more questions than answers docsepthis paper aims to quantify to some degree how gender country of origin and paper topic impact the review process decision to submit and paper review outcomes at iclr using data from 2017 to 2021 the authors report several interesting outcomes including a lower returnretention rate for women attendees of iclr as well as authors with low review scores higher scores for papers of western origin the increased likelihood of theoremcontaining papers to be accepted and the higher rate of acceptance for industry papers strengths this paper studies an important topic regarding how conference here specifically iclr submissionscoreacceptance is correlated with various aspects that are in and out of authors control including their choice of topic gender location etc this has implications for how conferences can be designed to be more inclusive guide scholars toward more friendlyspecific venues and how as a community we may rethink the assignment of prestigeimportance to conferences the authors examine several crossvariable impacts including gendertopic experiencegeographic region and genderexperience it would be interesting to measure how impactful such variables are compared to each other perhaps via some sort of anova or recursive elimination feature importance assessment or the proposal of a method for measuring confoundingcombinatorial impacts of each of the studied variables weaknesses it would have been interesting to see how gender representation could have changed if nonfirstlast authors were also labeled with their gender identity although i recognize this may have been too expensive to label it is very difficult in general to make strong comments about the impact of various demographic factors on conferences and the review process and while the authors propose several important insights and hypotheses to explain differences in eg return rate of women it is difficult to establish how likely these hypotheses are or to establish causality it would be interesting to see how the authors account for the selfselection bias of the fieldsorganizations that submit to or have members who attend iclr and perhaps compare this to other conferences including those that are more specialized to specific fields like acl as well as other more general conferences like neurips there is not much discussion of possible reasons for the scoring difference among geographic regions it would be nice to see more clarity around what causes could be hypothesized and what data the authors believe is missingcan be collected to test such hypotheses this paper touches on the very important topic of investigating explanatory effects of demographic and metafactors in conference attendance submission and acceptance this is an extraordinarily wide mandate and the paper attempts to cover a fair amount of ground this also results in several intuitive unanswered questions when reading despite the opportunity for deeper analysis i believe this paper is valuable in understanding representation at conferences specifically iclr and provides some starting points for improving the same docsepthis paper presents a study on the relationship between demographics and the 
submission and review of papers to iclr leveraging publicly available data from openreview the authors perform a substantial manual annotation effort on this dataset and then study how paper acceptance and retention ie whether authors submit to the conference again the following year are related to gender and geographical location by attempting to control for various factors topic industry affiliation experience level and the use of theorems are also examined strong points the peer review process in machine learning and diversity andor bias issues in csaiml are both crucially important issues that are in dire need of research this work addresses both of these issues simultaneously and is therefore likely to obtain substantial attention this work studies not only the impact of gender and geographic location but also industry affiliation the use of theorems topics experience level etc substantial and laborintensive manual annotation of iclr authors gender was performed in order to execute the study the authors chose to perform this work to get more reliable results instead of simply relying on automatic tools which are known to have a bias toward performing better on western names attempting to control for various factors due to the observational nature of the dataset makes the analysis much more rigorous there are several very interesting findings in the research which could have realworld impact on the peer review process and on the academy at large for example iclr authors are more likely to return or have their paper accepted if their mentor is the same gender larger iclr teams have higher review scores on average women tend to have larger coauthor teams than men weak points the precise relationship to the work of tran et al 2020 is unclear eg what are the methodological differences and the novel findings note if this paper turns out to be by the same authors as the tran et al arxiv paper and is actually a conference version of that same paper its significance would be higher i cannot assess this due to the double blind it is a complicated question as to whether this paper is a good fit for iclr on one hand this work does not have the methodological novelty we would expect of an accepted iclr paper and this type of research is not normally published at this venue on the other since it studies the iclr conference itself its findings are clearly of interest to the iclr community furthermore it would behoove the iclr community to be open to submissions such as this which are outside the box but still insightful arguably a journal format would be more appropriate for this work as it would provide more space for a thorough analysis then again there is no journal which is directly affiliated with the iclr conference so this would have disconnected the paper from the iclr community i would like to have seen a more sophisticated level of analysis which leverages techniques developed by the iclr community and from machine learning atlarge for example nlp methods including topic models word embeddings andor bert could have contributed to the analysis this work only scratches the surface of the analyses on this dataset which are possible when leveraging the authors gender annotations the paper could be strengthened by a deeper investigation into whether the disparities identified are due to factors relating to systemic bias sexism racism etc similarly the paper could have delved deeper into the extent to which english language capabilities are the reason for the disparities eg by reporting results 
when controlling for the percentage of english language speakers per country the potential impact of covid19 on the population under study should have been discussed since much of the analysis hinges on annotations it would be appropriate to report interannotator agreement metrics questions to the authors why is it particularly important to perform this study on the iclr conference other than the fact that the data are available on openreview additional feedback minor suggestions pg 3 the phrase limited evidence is a bit ambiguous as it is unclear whether authors are saying there is evidence the data suggest the statement is true but the evidence is limited or there isnt evidence the data suggest the statement is false but the evidence is limited pg 3 use parenthetical citations citep for etzkowitz et al 1992 rosser lane 2002 pg 5 increase monotonically with number of authors missing the pg 6 papers from east and south asian asia pg 7 women did doing better than men remove doing pg 9 as a function 3 industry indicators of pg 9 1062 missing parentheses around the number pg 9 mentorship is import important the authors should mention that this work assumes a gender binary and that future work could and should study nonbinary and transgender populations once this work is published it would be extremely valuable if the authors could make their annotated dataset publicly available this paper does not fit the typical mold of an iclr paper in that it is mainly empirical work rather than presenting novel methods as always more analysis could have been done however its findings regarding participation and review in iclr are valuable to the iclr community and it would likely obtain substantial attention in the literature and have an impact on the machine learning community at large docsepthe paper conducts an observational study of disparities in the iclr 20172021 reviewing process the work centers around the analysis of disparities in reviewing scores and acceptance rates across gender number of authors countries papers topics and industryacademia the authors present several findings that are likely of interest to the community and provide some potential explanations behind some of such findings the exploratory data analysis and regression results presented in the paper contain some interesting results although the authors do not provide explanations behind each of these results it might be helpful to raise awareness of these patterns in the community at the same time most of the findings from this exploratory analysis overlap with those of tran et al 2020 thus i am unsure of how novel the authors contributions are besides presenting an updated analysis as the authors correctly note in the ethics statement this is an observational study all regression models control only for the reviewers scores thus in many instances its unclear what kind of effect if any these models are actually capturing the paper contains statements such as probe the effects of paper topic on the review process review score impacts retention how it impacts their return rate we remove the effect of reviewer score by controlling for this variable there is still a small residual effect of gender etc these regressions simply uncover associational patterns and thus causal statements should be avoided about section 31 in section 31 the histogram on the left of figure 1 would suggest a large difference in scores between men and women so i found the fact that the difference was only 013 surprising is this difference of 013 based on the 
data whose distribution is displayed in the histogram was any residual diagnostics analysis for the logistic regression models conducted all models are simple ie they include only a few variables so it should be rather straightforward to check whether the assumptions are met eg by using the diagnostic plots presented here https://arxiv.org/pdf/1612.03257.pdf in the regression of retention on reviewers scores and gender how were observations of individuals that appeared in multiple papers as first authors handled these observations are dependent and thus it might be a good idea to cluster the standard errors by the first author are these results robust to changes in the way that reviewers scores are taken into account eg if the most negative score instead of the mean score is accounted for do the results still hold past work has shown that type i error may not be controlled for when running logistic regression on peerreviewed data https://arxiv.org/abs/1912.13188 so this issue should be acknowledged based on table 1 we cannot really say much about whether the difference between the coefficients in the two regression models is statistically significant the most straightforward way to make some inference on that would be to fit a single model of retention ~ intercept + is_woman + reviewer_score + reviewer_score * is_woman and do a wald test on the last coefficient as illustrated in the sketch after this review that said the difference in the probabilities predicted by the current models is very small some of the concerns and suggestions raised above apply also to the remaining sections of the paper about section 32 we find that the number of publications of the first author had a positive impact on return rate p 0143 table 6 this pvalue is above any of the significance levels that are generally considered 0001 001 005 01 so i would interpret this coefficient and not impact as not being statistically significant i read that borderline pvalue 011 so i assume that the significance level the authors adopted is 01 minor details some of the summary statistics that are reported contain two decimal digits while others only contain one the manuscript contains some typos eg these data was section 1 in zweben bizot 2021 i could not find the information about women comprising 236 of the enrollees in computation graduate programs section 2 perhaps it might be useful to mention that the papers included in the dataset represent almost but not all of the submissions to iclr in the period considered captions of figures 3-7 need to be improved table 1 needs to be improved too eg by adding an additional column on the left containing the labels men and women it would be helpful if the authors could provide the code to replicate these results the analysis presented in the paper contains some interesting results however many of the findings overlap with those of tran et al 2020 which conducted a similar analysis in addition i raised several concerns and questions on the methodology used in the paper ### Summary:
the authors have not responded to my concerns in the rebuttal response post which are crucial in justifying their methodology and reliability of their observations lack of multivariate analysis and unjustified generalization using limited and possibly biased subset of the data there are no connections made with the topics of interest in iclr either for these reasons i am willing to keep my position to reject this paper
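To make the last reviewer's modeling suggestion concrete, here is a minimal sketch of the single interaction model of retention and the Wald test on the interaction coefficient. The data and column names are synthetic stand-ins for illustration; this is not the authors' analysis or their dataset.

```python
# Illustrative sketch of the suggested analysis: one logistic regression of
# retention on gender, reviewer score, and their interaction, followed by a
# Wald test on the interaction coefficient. All data below are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "is_woman": rng.integers(0, 2, n),          # 1 if the first author is a woman
    "reviewer_score": rng.normal(5.0, 1.5, n),  # mean review score of the submission
})
# Simulate whether the first author submits again the following year.
logits = -3.0 + 0.6 * df["reviewer_score"] + 0.2 * df["is_woman"]
df["returned"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logits)))

# retention ~ intercept + is_woman + reviewer_score + reviewer_score * is_woman
model = smf.logit("returned ~ is_woman * reviewer_score", data=df).fit(disp=0)
print(model.summary())

# Wald test: does the effect of reviewer score on retention differ by gender?
print(model.wald_test("is_woman:reviewer_score = 0"))
```

The same review also suggests clustering standard errors when an individual appears as first author on several papers; with statsmodels this can be approximated by fitting with cov_type="cluster" and passing an author identifier in cov_kwds, assuming such an identifier column is available.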
10895, 1957, 2761, 533, 417, 512, 273, 253, 35103, 281, 17857, 32888, 275, 253, 2180, 2783, 50276, 17140, 621, 273, 8442, 5345, 878, 281, 320, 5520, 2829, 337, 3198, 281, 320, 5520, 1512, 24088, 407, 6240, 271, 3081, 5084, 327, 253, 1669, 4508, 253, 13301, 1821, 285, 2255, 50276, 262, 651, 320, 9371, 604, 253, 4477, 812, 2085, 253, 2127, 281, 25464, 841, 1543, 253, 1783, 3559, 275, 253, 2929, 4428, 690, 4722, 1543, 2299, 1142, 273, 253, 4342, 14787, 342, 1110, 273, 21191, 1162, 355, 9169, 534, 5196, 247, 2074, 1783, 275, 1635, 891, 5439, 2067, 7350, 285, 3533, 327, 253, 16182, 908, 275, 253, 2929, 50275, 187, 187, 4118, 18435, 27, 783, 4477, 452, 417, 10974, 281, 619, 7350, 275, 253, 30080, 22559, 2380, 1501, 534, 403, 9560, 275, 816, 5411, 616, 16182, 285, 13367, 273, 616, 7313, 3480, 273, 21471, 1783, 285, 26694, 1245, 26647, 970, 3710, 285, 6830, 23539, 8578, 273, 253, 941, 627, 403, 642, 10291, 1160, 342, 253, 12989, 273, 1600, 275, 17857, 32888, 2057, 323, 841, 4606, 891, 717, 7378, 281, 1978, 619, 1899, 281, 12009, 436, 2929 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper studies a problem of learning stepsize policy for lbfgs algorithm this paper falls into a general category of metalearning algorithms that try to derive a datadriven approach to learn one of the parameters of the learning algorithm in this case it is the learning rate of lbfgs the paper is very similar in nature to the papers of ravi larochelle maml and andrychowicz my biggest issue with this paper is the evaluation the paper itself cites in the introduction for largedimensional problems this procedure is likely to become the bottleneck of the optimization task however the paper doesnt not provide necessary evaluation to help resolving this issue in fact it is not very clear at all that the proposed method would work on a more general scenario when the evaluation dataset is wildly different from training the dataset im also a bit surprised to see this paper tackling specifically lbfgs algorithm instead of more general case of learning rate parameter for any gradient basis algorithm i would be curious to learn what is so special about lbfgs that made the authors chose it after all the paper and lbfgs deals with deterministic optimization problems on a full batch which limits the applicability of the paper docsepsummary the paper presents a novel stepssize adaptation for the lbfgs algorithm inspired by the learningtolearn idea the stepsize policy is determined by two linear layers which compare a higher dimensional mapping of curvature information which is trained to adapt the stepsize reasons for score while the idea of learning to predict a suitable stepsize is intriguing and is definitely worth pursuing i am not convinced that the proposed algorithm results in an active policy that usefully adapts the stepsize there are too many concerns that i think needs to be addressed and it is not clear if the speed up improves over the reliability of a line search i therefore vote to reject the paper in its current form pros it is clear that a lot of thought has gone into the project to come up with the policy i think it might have merit but requires additional tests the figures were first difficult to understand but once the content had been explained in the text the benefit of the chosen presentation became clear concerns my main concern is best visualized in figure 3 both the learned policy and the btls seem to mostly favour a step of 1 which raises several questions 1 what would be the results of using no adaptation and rely on a step of 1 or 09 constantly as a baseline 2 the pialgorithm mostly uses a stepsize of 1 which happens top be the upper boundary taum which means it is not clear if the policy network has learned that 1 is a good step or if it has not learned at all and the results are just due to the clipping what would happen if taum1 for example 3 given that both the btls and pi mostly use tk1 is there any intuitive explanation as to why the results between the two algorithms differ by so much in figure 3 are there additional figures where the btls similarly outperforms the competition it did reach 105 first in 60 of the tasks according to table 1 column 1 this despite the fact that btls is at least 1 forward pass more expensive per iteration than the policy for a fully connected network i think that is 50 of the iteration cost the benefit of using the double network for the policy is not clear to me what would be the result of using a single linear layer instead 
or a recurrent network that monitors temporal changes to the curvature information that is used as input given that the input and output dimensionality of the policy network is of low dimension it would be interesting to see what the weights and biases look like for respective policy layers by looking at the weights it would be possible to see what curvature information makes the network decide to adjust the steplength does the policy learn a cosine behaviour similar to the proposed possibility in the appendix could the policy be used for another optimization algorithm for which gtintercal dk0 such as rmsprop or gd it might be easier to understand the influence of the policy in such a setting comparably minor points section 3 first paragraph ends with a statement regarding rhoi to keep the hessian approximation spd by ignoring iterations is this used in the implementation table 1 according to what metric is first defined iteration time function evaluations it would be good to mention the average value of this metric for each optimizer at the end of k800 inner steps in section 6 it says that the learning rate of adam and rmsprop was chosen to yield fast convergence i understand this as quickly reaching a threshold not achieving best performance for allocated time is this correct that could help explain why so many settings in table 1 and figure 4 fail to converge personally i think the firstorder algorithms should be allowed to change the learning rate between problems to prevent them from not converging ex rmsprop eq7 st with or is the outer problem actually constrained and i missed something post rebuttal i have considered the revised article additional reviews and rebuttal and decided to slightly raise my score but i am still in favor of rejecting the paper below is a summary of my reasoning the authors have provided a good rebuttal and i am overall pleased with the detailed response additional experiments and figures and overall exhibited transparency unfortunately my assumption about tk seemed correct when considering the additional lbfgsb results which indicate that using standard tk1 is a really strong baseline that proved difficult to beat i would suggest finding another set of problems where tk1 is not so good for lbfgs or consider adapting another firstorder algorithm for which it is clear that the steplength needs to change between tasks and architecturesdocsep description this paper makes two separate contributions to machine learning 1 it proposes a neural network to learn the stepsize policy of lbfgs and 2 it uses the algorithm for training a deep neural network pros in practical terms it provides a solution to the problem of finding an appropriate stepsize policy it also demonstrates that the policy out performs adam and rmsprop on a rather strangely chosen problem cons learning how to optimise by examining related problems is often a fruitless endeavour because small changes in a problem can drastically change the behaviour of optimisation algorithms it would have been nice to see more convincing evidence that learning a stepsize polices generalises across at few more problem domains for me the evaluation of their algorithm for training a neural network was slightly unconvincing possibly this was just chosen as an example optimisation problem and the application shouldnt be taken too seriously a comment to that effect would be useful obviously the use of fullbatch training is not that practical for most problems for neural network training robustness to the noise produced by 
minibatch training is important to understand although adam and rmsprop are stateoftheart optimisers for minibatch training when compared on fullbatch it would be useful to compare with other standard approaches such as conjugate gradient etc the choice of problem was puzzling clearly an mlp on mnist is not representative of modern machine learning it left the question of whether a small network was deliberately chosen because the algorithm does not scale to really large networks again a comment about this would have strengthened the paper i was left with the impression that the authors were being slightly selective in their choice of problems for showing the utility of their method i would have liked to see more conclusive evidence in this direction and a clearer discussion of the regime where this method is likely to perform well typos the paper is comprehensible but would gain from being proof read by a native speaker this is not a consideration that affects the valuation examples of rewordings that would make the paper flow slightly better are p2 l4 good well p2 l8 exempts frees p2 l23 seek search p3 section 3 paragraph 2 line 2 it does there does docsep1 paper contribution summary this paper proposes a neural network architecture with local gradientquasi gradient as input to learn the steplength policy from data for the specific lbfgs algorithm the steplength policy can be learned from the data with similar optimization problems the designed neural network architecture for steplength policy is using inner product as input to allow the policy to be independent of the problem size and the logarithm operation is used to enable products and division among powers of the inputs the numerical example shows that the learned policy is comparable to lbfgs with backtracking line search and potentially better generalization to the same problem with higher dimension 2 strong and weak points of the paper strong part 1 the network architecture for the steplength policy is very interesting and may be useful for other problems with similar needs 2 the numerical experiment shows that the learned steplength policy is comparable to the same algorithm with backtracking line searched steplength which looks promising week part 1 the papers goal is limited to design the steplength policy for one specific optimization algorithm lbfgs if the original lbfgs algorithm is not effective for certain problems then the learned steplength policy may be unuseful as well 2 based on table 1s metric of gradient length smaller than epsilon the learned steplength policy seems not better than adam even in the higher dimension setup in which the adam is using the pretuned learning rate the result in fig 5 seems better for the learned steplength policy but that uses the smallest value over training as compared value which may not be fair because adam rmsprop and lbfgs are different algorithms which may result in different local minima in different problems 3 the performance test setup is not realistic in neural network model training the setup for training the model inner optimization along with steplength policy model outer optimization is with a fixed 1000 images randomly selected from 60 batches of data over t 16 outer iteration and 10 epochs when comparing different algorithms performance it also splits the test dataset into 10 batches of 1000 images and record performance for each fixed batch dataset which is not realistic in usual neural network model training usually different batches are randomly selected at each time 
step as opposed to being fixed this makes the claimed performance doubtful in realistic setting 3 based on the strong and week points of this paper i tend to reject it 4 supporting arguments please refer to my above arguments 5 questions 1 can authors please explain the first two plots in figure 4 it seems to me that both adam and rmsprop are faster reaching to the required gradient norm precision the precision seems different between these two algorithms and the authors claimed one why not use the same precision requirement so that we can compare these algorithms directly 2 can authors provide any feedback on the third point of weak part 3 can this proposed architectureidea be used as a stepsize policy for not just lbfgs ### Summary:
the paper presents a novel procedure to set the stepsize for the lbfgs algorithm using a neural network overall the reviewers found the paper interesting and the main idea wellthought however a baseline that was proposed by one of the reviewers seems to be basically on par with the performance of the proposed algorithm at least in the experiments of the paper for this reason it is difficult to understand if the new procedure has merit or not also the reviewers would have liked to see the same approach applied to different optimization algorithms for these reasons the paper cannot be accepted in the current form yet the idea might have potential so i encourage the authors to take into account the reviewers comments and resubmit the paper to another venue
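to make the mechanism concrete, here is a minimal sketch of a problem-size-independent step-length policy of the kind the reviews above describe (inner products of the gradient and search direction, log-transformed and mapped through two linear layers to a positive step that is clipped to an upper bound). the feature set, layer sizes, random weights and clipping bound are assumptions for illustration, not the architecture from the paper under review.

```python
# hedged sketch only: feature choices, layer sizes, random weights and the clipping
# bound are assumptions for illustration, not the reviewed paper's architecture
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 3)), np.zeros(8)   # first linear layer
W2, b2 = rng.normal(size=(1, 8)), np.zeros(1)   # second linear layer
TAU_MAX = 1.0                                   # upper bound on the step length

def step_length(g, d):
    # inner products keep the input dimension fixed regardless of problem size
    feats = np.array([g @ g, g @ d, d @ d])
    z = np.log(np.abs(feats) + 1e-12)           # logs turn products/ratios of powers into sums
    h = np.maximum(W1 @ z + b1, 0.0)            # hidden layer with relu
    t = np.exp((W2 @ h + b2)[0])                # map back to a positive step length
    return float(np.clip(t, 0.0, TAU_MAX))

# toy usage on a quadratic: g is the gradient, d the search direction
g = np.array([0.3, -1.2, 0.7])
print(step_length(g, -g))
```

with trained weights, a function like step_length would be evaluated once per lbfgs iteration in place of a backtracking line search, which is where the claimed cost saving over btls would come from.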
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors hypothesize that description of visual texturesobjects using summary statistics as may be doing the human vision in the periphery may induce robustness to adversarial examples in the discussion they argue that this may be the case because imposing separability of classes with highvariance responses of peripheral vision leads to enlarged class separation of smallvariance sets in foveal vision which have the same mean the authors check their initial hypothesis in an indirect way they check that robust representations to adversarial attacks resemble human representations they do so by comparing tildex synthesized images from networks which are robust to adversarial examples and hatx synthesized images from models of human peripheral vision they argue that given the fact that these classes of images tildex and hatx are equally discriminable by humans from the original images x the image representation of robust networks is similar to the periphery image representation in humans the authors also compare the discriminability of images xs generated from standard nonrobust networks the specific way in which tildex hatx and xs are compared is measuring the psychometric function over eccentricity so that these images are discriminable by humans from the original image interestingly these images are also compared using subjective image distortion metrics while the idea of connecting summary statistics with the robustness to adversarial attacks is interesting i feel the paper has a number of points that should be clarified by the authors before it can be accepted for publication at iclr 1 two representations with the same nullspace are not necessarily the same the authors say that the representation of robust nets is similar to the representations of the models of peripheral vision only because the images coming from the inverse of such models are metameric similarly nondiscriminable for humans note that this only means that these representations share the null space and this null space is also shared by humans however this is far from being the same as saying that the representations are equivalent or similar an example of this difference can be obtained from color science where the metamer concept comes from imagine the space of spectral radiances with vectors living in a 100dimensional space say one value every 3 nm in the 400700 nm interval consider a perceptual system defined by a linear operator p that projects this 100dimensional space into a 3dimensional space p is a 3100 matrix that performs the integration by the cone sensitivities note that this perceptual system p has a nullspace the inverse is not unique and many different spectra lead to the same tristimulus representation we say that these physically different stimuli are metameric they are the same for the perceptual system p they are perceptually same because these different spectra differ in vectors that belong to the nullspace of the projection operator p with this in mind if we consider any arbitrary rotation of the 3d vectors all these would be different color representations but they would share the same nullspace with the same metamers does this mean that all these different color spaces are equivalent not at all actually spaces that separate the achromatic information from the chromatic information as opposed to lms cones are closer to the internal physiological and psychophysical representations of 
color and their statistical properties of these opponent spaces are quite different too therefore sharing some null space does not mean having equivalent representations now imagine the complication of going from the 100 dimensions of spectral radiance to the millions of dimensions in color images this complexity implies that the rationale followed in the paper is not obvious the authors sould give additional evidences of why their rationale is correct or acknowledge the problems in this assumption 2 authors should give additional evidence for the suggested connection between summary statistics and robustness to adversarial attacks specifically in the discussion in fig 8 the authors argue that this conection may come from the fact that imposing separability of classes with highvariance responses of peripheral vision leads to enlarged class separation of smallvariance sets in foveal vision with samples that have the same mean i think this interesting suggestion in the discussion is not justified in any way or i totally missed the justification sorry if that is the case 3 generation of hatx implies the minimization of a perceptual distance with regard to tildex as explained in section 22 eq 7 in this situation who is responsible for the humanlike behavior of tildex more specifically who is responsible for the similar behavior of humans in hatx and tildex a the fact that the robust representation is humanlike as argued by the authors or b the fact that the images hatx were selected so that they were close to tildex for humans it is interesting to note that the texture distortion metric dists is based in computing summary statistics which are similar in spirit to the synthesis method taken as model of peripheral vision consider that the procedure in freeman and simoncelli 11 reduces to enforcing the replication of certain statistical descriptions as in the old portilla and simoncelli 01 paper in a similar vein in dists texture similarity is measured according to mean and variance of the responses in different layers of a vgg net and in fact in ding et al pami 21 they also propose enforcing the replication of these descriptors as a texture synthesis method all this seems quite overlapped 4 related to the previous question fig 6 that shows that the perceptual distance of tildex and hatx wrt x are similar seems like a consequence of the selection of hatx described in section 22 no 5 moreover given the fact that xs are so subjectively different from natural images the whole psychophysical procedure seems unnecessary simple inspection of the synthesized images xs and tildex is enough to see that tildex is closer to whatever hatx and x and that xs simply is just not right as a result it is not surprising that the psychometric functions for tildex are more similar to those of hatx than those of xs in this way i feel the psychophysical experiments and the perceptual image quality metrics do not add much with regard to the qualitative visualizations shown in tsipras et al iclr 2019 6 finally given the above concerns i suggest another way of assessing the similarity between the robust networks and human vision why not trying to simulate other human tasks such as in geirhos et al iclr 19 cited in the work or classical psychophysical responses as in martinezgarcia et al frontiers in neuroscience 2019 not cited in the work i am not suggesting to do this now but i feel a comment on alternative comparisons with humans is due the attempt to assess the humannature of the image representation of networks which 
are robust to adversarial attacks is very interesting both for machine learning as well as for computational neuroscience however a number of points need to be clarified before the work can be published at iclr 1 representations with shared nullspace the central idea that is checked in the work are not necessarily the same simple linear examples tell us that they can differ in arbitrary rotations or extra linear invertible transforms and these may represent quite different qualitative nature 2 authors should give additional evidence for the suggested connection between the use of summary statistics and robustness to adversarial attacks fig 8 3 authors should mention other ways to assess the similarity between the robust networks and humans which may be centered in the response space rather than in the null space see extra minor comments at the detailed review docsepthis paper examines whether an adversarially trained robust resnet50 classification model might better resemble the representation of human peripheral vision than a nonadversarially trained nonrobust model it does so by means of a human perceptual experiment test images from restricted imagenet a collapsed subset of imagenet comprising 9 superclasses are resynthesized via gradient descent from random noise initial conditions by minimizing the l2 difference in the penultimate feature activation of both a robust and nonrobustmodel in addition the same stimuli are resynthesized according to previously validated models of summary statistic based peripheral computation texforms a range of perceptual discrimination experiments oddoneout and two alternative forcedchoice are carried out with human subjects among different pairs of stimuli original robust nonrobust and texform at varying degrees of peripheral eccentricity the experiments show that the discriminability of robust and texform stimuli vs original stimuli degrade similarly with increasing eccentricity while nonrobust stimuli are relatively easy to pick out vs original stimuli at all levels of eccentricity strengths 1 i thought the topic of the paper was interesting and relevant for iclr since it probes at a link between the representations learned by adversarially trained image classifiers and the robustness of human peripheral vision and suggests some possible paths forward eg spatiallyadaptive computation of continuous representations 2 the experiments appear to me to be carefully thought through and executed and the results are well explained and documented 3 the paper describes related literature well and i feel like i learned some useful things from reading it so thank you weaknesses 1 i find the assertion that robust representations capture peripheral computation similar to current stateoftheart texture peripheral vision models a little strong its certainly the case that the synthesized metamers have similar but not matching behavior curves in the precision vs eccentricity discrimination tasks of fig 5 but is this sufficient to claim that they capture computation similarly is it possible that a model could yield synthesized images which also have similar curves but have nothing to do with how peripheral computation takes place what if you were to learn a decoder from the representations of robust and nonrobust networks and train it to reconstruct the images from those representations then use those reconstructions in place of the current synthetic images do you think these relationships would hold up would that be an invalid way of producing synthetic images 2 along 
that line id like to see some discussion and analysis about the role of optimization in generating the stimuli in particular do the standard and robust images share the same final l2 minimization error does gradient descent find suboptimal local minima more often for either standard or robust how often does it matter could the optimization landscape of the nonrobust and robust models lead to incorrect experimental conclusions or is it not a concern quantitatively since standard and robust models use the same network what are the distributions of converged l2 distances in both sets of tested stimuli similarly what are the l2 distances vs the original representation if for example you take a robust image and feed it into a standard model and vice versa is one model more invariant to the others stimuli 3 there are various details missing from the experiments section which i think are important was vanilla gradient descent used as currently implied in the paper or a more adaptive optimizer eg adam as in feather et al or gradient descent with momentum as in mahendran et al how was the experiment carried out remote inlab what controls were in place was an eye tracker used to ensure subject fixations did all subjects view all stimuli pairs 4 i would love to understand the role of visual angle in the experiments it seems the choice was made to map 256x256 pixel stimuli to 7x7 dva p4 how was this decided presumably the results are sensitive to this choice if say a 14x14 dva were used instead would it become easier to discriminate the robust and original stimuli would it make sense to check that similar trends continue to exist between texform and robust stimuli even despite such changes maybe just causing shifts in the curves in fig 5 5 i think it should be made clearer at an earlier point in the abstract introduction why human peripheral vision as opposed to foveal vision is important to this problem i like the clarity of the statement at the bottom of page 3 wed like to investigate if the following statement is true a transformation resembling peripheral computation in the human visual system can closely be approximated by an adversarially trained network i think something along these lines should be stated in the abstract to improve readability also i found the term biological plausibility to be a little vague and unqualified until the discussion section and as mentioned above i am not yet convinced that these experiments definitely find biological plausibility 6 i would like to understand why the choice was made to take the representation at the secondtolast network layer after average pooling which destroys some potentially useful local information did you try to optimize for the representation before average pooling or is there a particular reason why doing so would not make sense i would be very curious to see whether that closes the gap between the robust and nonrobust discriminability curves minor points 1 figure clarity the clarity of fig 4 would be helped by adding what kind of stimulus each one is in the frame not easy to tell without zooming in a lot fig 6 red and pink are a little hard to distinguish on my screen maybe choose a more distinct color fig 7 is similarly difficult to make out without zooming in a clearer alternative might be eg filled and unfilled circles 2 sec21 we used gradient descent to minimize the difference between the representation of the secondtolast network layer of a target image and an initial noise seed as shown in figure 9 rather than an initial noise seed 
would it be clearer to say a synthetic image initialized with random noise 3 i understand the code used in the paper is already largely public from other sources but the work may have better impact if the various synthetic and original stimuli could be released alongside the paper perhaps along with the generation scripts i think this paper is marginally below the acceptance threshold but this may be due in part to the exposition of the work i like the effort to try to link these two areas of human peripheral vision and robust representations and i think the paper does quite a good job to describe the experiments undertaken if the authors can respond convincingly to the various questions i and other reviewers may have i would be willing to upgrade my rating edit following the rebuttal from the authors and taking into consideration their response to other reviewer comments i am upgrading my score docsepthis paper presents a psychophysical comparison between a peripheral summary statistics model texforms adversarially robust and nonrobust neural networks images are generated from each model network using gradient ascent then presented to humans at different retinal eccentricities the key finding is that images generated by nonrobust networks dont look anything like natural images and so are easily discriminable whereas robust and texforms become more difficult to tell apart from natural images as they are moved further into the periphery the main claim is that because of this similar performance falloff training on adversarial noise may cause similar representations to be learned as in the human periphery by comparing the generated images in terms of physical and perceptual distances the claim is further strengthened as suggesting their models compute the same transformations the paper ends with the interesting conjecture that peripheral computation may implicitly act as a natural visual regularizer strengths the paper raises some interesting ideas linking network robustness with human perception specifically about the roles of eye movements and peripheral computations in inducing useful invariances the psychophysical experiments appear well conducted though see below weaknesses purely in terms of writing i found the motivation and results of the paper difficult to understand until the discussion the key motivating question posed on page 2 could it be that adversarially trained networks are robust because they encode object representations similar to human peripheral computation comes out of the blue why would they we are of course also robust perhaps more so in the fovea so this didnt seem to make sense on the face of it the paper claims that the texform and robust models are doing similar things or even the same computations figure 6 based on 1 the finding of similar psychometric functions figure 5 and similar iqa scores figure 6 this is quite weak as the paper itself notes there are many distortions derived from the synthesized model stimuli that can potentially yield the same perceptual sensitivity in a discrimination task i do not find the iqa analysis convincing either if i understand correctly these measure the distance between the original and the synthesized and synthesized vs synthesized images within the same model but isnt the key issue whether the generated images look similar to each other presumably if they are performing the same computations they should i think a stronger way to assess this point would be to perform some variant of maximum differentiation competition mad 
wang simoncelli 2008 loosely synthesize an image that maximally changes the robust synthesis while keeping the texform representation unchanged and vice versa if the two models are performing similar computations then moving in some direction in one models space while keeping the other fixed should create little perceptible change missing experimental details was the human data collected in a lab or online was eye tracking performed to monitor fixation was the study conducted according to relevant ethical guidelines and approved by a review board or similar i could also find no details on this in the supplement minor p 6 typo mainly human observers the differences in the terms in equation 7 were hard to read tildex vs hat x look very similar so i was confused for a while a relevant reference for the conjectures on page 9 around learning invariances from fovea to periphery is nandy a s tjan b s 2012 saccadeconfounded image statistics explain visual crowding nature neuroscience 153 463469 p 9 counterintuitively the fact that our visual system is spatiallyadaptive could give rise to a more robust encoding mechanism of the visual stimulus as observers can encode a distribution rather than a point as they move their center of gaze does this imply the converse that nonfoveated biological vision systems are less robust i dont know of evidence that speaks specifically to this but there are certainly organisms with nonfoveated visual systems that get along just fine in their niche the paper makes a few interesting arguments linking robustness to foveated visual systems i find the evidence that the texform and robust models are doing similar the same computation to be only weakly supported the story of the paper could also be more clearly constructed edit after discussion given report of new experimental evidence that the texform and robust models generate perceptuallysimilar images i am raising my score from 5 to 8 docsepthis study draws a surprising connection between adversarially trained cnns and human peripheral perception using the synthesized metamers the authors show that the human ability to discriminate between natural images and their synthetic metamers drops in a similar fashion for metamers generated by inverting adversarially trained cnns and for metamers generated by an inversion of a wellstudied model of peripheral vision ie texforms as display eccentricity increases this may indicate that adversarially trained models bear some resemblance to human visual processing at the retinal periphery i enjoyed reading the paper and found the connections it makes inspiring thoughts overall the metameric manipulation seems to make a simple prediction if a model correctly captures human vision for a given eccentricity then the metamers should not be discriminable by humans when presented at that eccentricity the authors interpret the decreasing discriminability as indicative that robust cnns enjoy a similar relation to peripheral vision as the freeman simoncelli model does i agree that this is a reasonable interpretation however the data also suggests a discrepancy between robust cnns and peripheral vision as discriminability remains above chance even for 30 visual degrees the authors might consider addressing this discrepancy in the text and in subsequent analyses analyzing what humans see that the model cant doesnt fitting the two free parameters of the freeman simoncelli model to the robust cnn weaken the conclusion that the two models are equivalent in their predictive power what if the 
texforms are formed independently of the cnn it is unclear to me whether its fair to generate metamers to be tested in the periphery without simulating visual acuity limitations during stimulus synthesis the standard cnn might be driven by highfrequency information that isnt even there when the stimulus is presented at 20 or 30 degrees at least the authors might want to evaluate posthoc whether the metamers remain metamers under highpass filtering emulating retinal constraints at the different eccentricities the synthetic vs synthetic condition is included without sufficient motivation or discussion how should this condition be interpreted differently than the natural vs synthetic condition this point is particularly relevant for the standard cnn where these two conditions diverge do subject demographics ethical approval and subject compensation appear in the text opensourcing the code stimuli and behavioral results would increase the impact of this work learning invariances shared between foveal and peripheral processing is an interesting idea perhaps this approach can replace adversarial training this direction should be pursued further in another study minor points the discussion treats standard cnns as spatially uniform however in practice this is not the case due to effects related to pooling azulay weiss 2018 and padding alsallakh et al 2020 the latter study even indicates implicit foveation some of the cited arxiv citations are outdated and should be replaced with citations of the corresponding conference proceedings references alsallakh b kokhlikyan n miglani v yuan j reblitzrichardson o 2020 mind the padcnns can develop blind spots arxiv preprint arxiv201002178 azulay aharon and yair weiss why do deep convolutional networks generalize so poorly to small image transformations arxiv preprint arxiv180512177 2018 although this work has some weaker aspects this is a mature research project whose acceptance to iclr would promote more important research into the connections between cnns robustness and peripheral vision therefore i recommend acceptance ### Summary:
this paper shows that images synthesized to match adversarially robust representations are similar to original images to humans when viewed peripherally this was not true for adversarially nonrobust representations additionally the adversarially robust representations were similar to the texform model image from a model of human peripheral vision reviewers increased their score a lot during the rebuttal period as the authors provided more details on the experiments and agreed to tone down some of the claims especially the strong claim that the robust representations capture peripheral computation similar to current soa texture peripheral vision models as well stated by reviewer s6dv two representations with the same nullspace are not necessarily the same with reviewer scores of 8 across the board reviewers agree that this is interesting work that should be presented at the conference i agree
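The metamer-synthesis procedure that the reviews above keep referring to — gradient descent on an image, starting from random noise, to match a network's penultimate-layer activations — is compact enough to sketch. The sketch below is illustrative only and fills in details the reviews leave open: it assumes a torchvision ResNet-50, the Adam optimizer, a fixed step count, and mean-squared error as the feature-space L2 objective; it is not the reviewed paper's implementation, and the interesting comparisons would require loading the actual standard and adversarially robust weights.

```python
# Minimal sketch of feature-space metamer synthesis (assumptions: torchvision
# ResNet-50, Adam, MSE objective; not the reviewed paper's exact setup).
import torch
import torch.nn.functional as F
from torchvision.models import resnet50

def penultimate_features(model: torch.nn.Module) -> torch.nn.Module:
    # Everything up to and including global average pooling, i.e. the
    # representation just before the final classification layer.
    return torch.nn.Sequential(*list(model.children())[:-1])

def synthesize_metamer(model, target_img, steps=2000, lr=0.05):
    """Gradient-descend a noise image until its penultimate activations match the target's."""
    feat = penultimate_features(model).eval()
    for p in feat.parameters():
        p.requires_grad_(False)           # only the image is optimized
    with torch.no_grad():
        target_act = feat(target_img).flatten(1)

    x = torch.rand_like(target_img, requires_grad=True)   # random-noise initialization
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.mse_loss(feat(x).flatten(1), target_act)  # L2 distance in feature space
        loss.backward()
        opt.step()
        with torch.no_grad():
            x.clamp_(0.0, 1.0)                             # stay in valid image range
    return x.detach()

# Usage (hypothetical): metamer = synthesize_metamer(resnet50(), img)
# with img a (1, 3, 256, 256) tensor in [0, 1]; swap in robust weights to compare.
```

Clamping to [0, 1] after each step keeps the optimization in valid image space, which is the usual choice when no other pixel-domain constraint is imposed.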
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper focuses on improving the condition of overparameterization for stochastic gradient descent on overparameterized shallow neural networks the authors first present a general global convergence result for sgd on a finitesum compositional optimization problem and then apply this general result to twolayer neural networks with smooth activation functions to obtain a sufficient condition of overparameterization of order om2 sqrtb where m is the sample size and b is the minibatch size while the results of this paper look correct and rigorous i have a major concern about the significance and novelty of the paper first of all the o m2 sqrtb condition on overparameterization in fact requires additional assumptions on the data or data distribution that is not covered by assumptions 14 in this paper this result relies on an estimated order of sigmamathrmmin xt that is derived by assuming x is uniform this is a very strong data distribution assumption with this additional assumption it is not fair to compare the condition sigmamathrmmin xt with the other works in table 1 for example in order to compare the result with existing works considering the separable data assumption the authors should derive additional guarantees of sigmamathrmmin xt based on this separable data assumption as is done in oymak soltanolkotabi 2020 moreover the discussion on ji telgarsky 2020 chen et al 2021 and daniely 2020 is too vague and not convincing it still seems that these works have already proved better conditions of overparameterization than this paper since these works also highlight the improvement of overparameterization conditions the authors should consider adding them to table 1 and discuss the differences between this work and existing works in detail at last as given in table 1 in this paper the result in this paper is only better than oymak soltanolkotabi 2020 in the sense that i this paper considers the training of parameters on both layers while oymak soltanolkotabi 2020 only considers the training of the first layer however in literature it is widely believed that the difficulty of the analysis lies mainly on the training of the first layer parameters ii this paper gives results for different minibatch sizes however this paper also has a disadvantage compared to oymak soltanolkotabi 2020 as this paper requires a smooth activation function while oymak soltanolkotabi 2020 works for relu activation therefore the result of this paper can be quite incremental compared to oymak soltanolkotabi 2020 typos 1 the sample size is denoted as n in table 1 however in the latter part of the paper the notation of sample size is m 2 the notation nabla hjwi equation 4 is not consistent with that in the discussion below it nabla hjphiwi my major concern about this paper is its significance and novelty based on this concern i would like to recommend rejection docsepthis paper first analyzes the convergence of sgd under assumptions of pl condition on loss function and growth condition on stochastic gradients then it considers a twolayer neural network with some special setting and claims that this network with quadratic loss satisfies both pl condition and growth condition and hence sgd converges the results presented in this paper are already known in the literature please check the two highly related papers liu et al 2021 liu et al 2020 specifically assuming pl condition on the loss function convergence 
of sgd is already obtained in liu et al 2021 with an exponential learning rate this covers theorem 1 of this submission the concepts of nearisometry definition 4 and length of trajectory proposition 1 are also similarly established in liu et al 2021 nearisometry corresponds to the socalled uniform conditioning of ntk neural tangent kernel and proposition 1 corresponds to the existence of solution in a neighborhood ball pl conditions are shown on neural networks in liu et al 2021 i noticed that liu et al 2021 includes deep neural networks as well as shallow ones but this submission only considers shallow networks with more assumptions assumption 3 4 the proof of the growth condition of the shallow network is incorrect in section 2 the theory assumes growth condition on stochastic gradients ie derivative of loss wrt the model parameters however when proving a shallow network satisfies growth conditions in the proof of theorem 2 the paper analyzes a different object derivative of loss wrt the model output the notations of the paper are not consistent and ambiguous which makes it hard to read for example in eq13 and the sentence after matrix z is discussed but z is never defined before references liu et al loss landscapes and optimization in overparameterized nonlinear systems and neural networks 2021 liu et al on the linearity of large nonlinear models when and why the tangent kernel is constant neurips 2020 edited after author feedback thanks for the authors for the feedback the paper seems properly cited the anonymous paper so i removed the dual submission concern flag in the ethic review part as for the high similarities with prior works i never acknowledged that there are significant differences between the current submission and prior works eg liu et al 2020 the concepts and main logic are highly similar to prior works i mentioned above i have read the other reviews and agree that this paper lacks novelty and that main concepts and methods are already known in prior works so i would like to keep my score unchanged 1 known results 2 incorrect proof docsepthis paper proved that a twolayer neural network with smooth activation and proper initialization can converge linearly to a global minima of training loss using minibatch sgd when the width is larger than omegam2sqrtb where m is the number of training data and b is batch size as the batch size increases this provides an interpolation between the quadratic result for sgd and the subquadratic result for fullbatch gd to prove this result the authors first provide a general convergence result for sgd on a particular class of functions and then apply this framework to 2layer neural networks to derive the width requirement strengths 1 this paper provided a better width requirement bound for linear convergence in twolayer neural network training by generalizing techniques from previous results and taking batch size into consideration specifically compared to oymak soltanolkotabi 2020 this paper enables the training for both layers instead of a single layer besides compared to anonymous this paper enables training using minibatch sgd instead of gd 2 this paper is generally wellorganized and easy to follow it also provided detailed proof sketches with a few remarks making the proof easy to understand 3 the related works appear to be adequately cited and compared in detail concerns 1 the novelty of this paper might be somewhat limited and this is my major concern of this paper i will explain in detail in the following paragraphs 11 the learning 
dynamics of neural networks in this paper are still in the lazy regime and the proof ideas are of similar styles as previous papers in this area specifically as shown in lemma 1 the authors require that the weights do not move far ie larger than a constant from the initialization to ensure the local smoothness and bound the eigenvalues of the jacobian the bound on minimum eigenvalue of the jacobian ensures linear convergence which results in bounded weight change and then induces bounds on eigenvalues of the jacobian therefore the learning dynamics in this paper still require a lot of things including weights jacobian matrices etc to remain close to initialization which should be considered as lazy regime this is different from the neural network training in practice since empirically the weights of neural networks usually move far from initialization 12 the novelty of the techniques used in this paper seems a bit limited the bounds for local smoothness betaphi eigenvalues of phiw0 initial function value and preservation of nearisometry during training all come from anonymous furthermore the idea of bounding the probability that the weights leave the local neighborhood of the initialization exists in theorem 31 of oymak soltanolkotabi 2019 and theorem 26 of oymak soltanolkotabi 2020 this paper provided a new way of bounding the expected length of sgd trajectory proposition 1 enabled training for the secondlayer weights and generalized sgd to minibatch sgd by introducing the batch size b theorem 2 to interpolate between anonymous and oymak soltanolkotabi 2020 the techniques used in these steps seem specific to the particular setting in this paper and might not generalize thus the technical novelty of this paper might be a bit limited 2 the results in this paper only apply to smooth activation functions with bounded hermite norm which is a bit strong and the width requirement also depends on the smoothness factor and hermite norm most previous works hold for relu activations as also shown in table 1 minor comments 1 the statement of theorem 1 might need to be further clarified about the randomness when stating theorem 1 the authors said with probability at least 1zeta and the result still has the expectation notation it would be better if the authors could clarify what randomness goes into the 1zeta part and what randomness is the expectation taken over one possible way might be to define the event that sgd travels outside the neighborhood of the initialization and state theorem 1 in a similar way as theorem 26 in oymak soltanolkotabi 2020 typos 1 definition 3 frac1c frac1b 2 rhophi is used in proposition 1 but not formally defined until lemma 3 so it is better to move the definition earlier 3 simeq is used in lemma 2 and section 31 but is not formally defined update thank the authors for their responses i have read all other reviews and the authors responses to all the reviewers and i have decided to keep my score unchanged the authors responses address some of my concerns eg their main results do not directly require the weights during training remain close enough to the initialization however my major concern the novelty of the proof techniques which is also shared by the other reviewers still remains because most of the resultsmethods used in this paper already exist in the literature therefore i would like to keep my score and tend to recommend rejection i tend to vote for rejecting this paper despite generalizing the linear convergence result to minibatch sgd and improving the width 
requirement to subquadratic my major concern about this paper is its limited novelty as explained in the concerns section above docsep1 this paper proves that sgd converges to a global minimum in certain nonconvex problems assuming the loss function satisfies a growth condition the proof relies on assuming that the initial jacobian matrix is nonsingular and shows that it stays nonsingular since sgd iterates remain close to the initialization 2 the paper then applies the above analysis to a twolayer neural network and proves that a subquadratic scaling on the width is sufficient for global convergence assuming the minibatch size grows with the number of training samples for constant batch size it requires quadratic overparameterization furthermore an interpolation between subquadratic and quadratic scaling is given depending on the batch size pros 1 this paper relaxed the overparameterization requirement in twolayer neural networks for the sgd algorithm which is certainly an important direction 2 the proof involves bounding the distance to initialization for sgd iterates which is much more challenging than for gd i think the techniques here can be useful in future studies in sgd convergence cons 1 i am concerned about the novelty of the proof it seems to me that the proof is very similar to the ntk analysis in the sense that it argues the jacobian matrix is fullrank during the training because the sgd iterates stay close to the initialization i wonder if there is any conceptual difference between this proof and the ntk argument 2 in section 31 the authors assumed that mgeq d02 and claimed its the case in practice i think in practice the number of training samples m is much smaller than the square of input dimension d02 for example in cifar the input dimension is 32times 32times 33072 and there are only 50000 training samples in imagenet the input dimension is 256times 256times 3196608 and there are only 1m samples 3 one of the messages of this paper is that a smaller batch size requires a wider neural network is there any empirical experiment that can support this claim otherwise this might just be an artifact of the analysis the main result of this paper is proving the subquadratic width scaling for sgd convergence in twolayer neural networks assuming the batch size grows with the number of training samples i understand that the analysis requires new techniques to bound the distance of sgd iterates to its initialization but i think the proof is conceptually the same as ntk analysis and am concerned about its novelty therefore i think this paper is marginally below the acceptance threshold i will be willing to raise my score if my concerns are addressed thanks for the response after reading the response and other reviews i decided to keep my original score my major concern is still the novelty of the proof ### Summary:
this paper shows sgd enjoys linear convergence for shallow neural networks under certain assumptions however reviewers reach the consensus that this paper lacks technical novelty the meta reviewer agrees and thus decides to reject the paper
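For reference, the PL-based linear-convergence argument that these reviews repeatedly invoke is short in its full-batch form; the submissions extend it to minibatch SGD, which is where the growth condition on stochastic gradients enters. The derivation below is the standard textbook version, stated only as a worked example and not a restatement of the submissions' proofs.

```latex
% Assume L is beta-smooth and satisfies the Polyak-Lojasiewicz (PL) inequality
% with constant mu > 0. Full-batch gradient descent with step size 1/beta:
\begin{align*}
L(w_{t+1}) &\le L(w_t) + \langle \nabla L(w_t),\, w_{t+1}-w_t\rangle
              + \tfrac{\beta}{2}\,\|w_{t+1}-w_t\|^2
              && \text{($\beta$-smoothness)}\\
           &=  L(w_t) - \tfrac{1}{2\beta}\,\|\nabla L(w_t)\|^2
              && \text{(step $w_{t+1}=w_t-\tfrac{1}{\beta}\nabla L(w_t)$)}\\
           &\le L(w_t) - \tfrac{\mu}{\beta}\bigl(L(w_t)-L^{*}\bigr)
              && \text{(PL: $\|\nabla L(w)\|^{2}\ge 2\mu\,(L(w)-L^{*})$)}
\end{align*}
\[
L(w_{t+1})-L^{*} \le \Bigl(1-\tfrac{\mu}{\beta}\Bigr)\bigl(L(w_t)-L^{*}\bigr)
\quad\Longrightarrow\quad
L(w_t)-L^{*} \le \Bigl(1-\tfrac{\mu}{\beta}\Bigr)^{t}\bigl(L(w_0)-L^{*}\bigr).
\]
```

In the reviewed setting the role of overparameterization is to keep the effective PL constant, tied to the smallest eigenvalue of the Jacobian (NTK Gram) matrix, bounded away from zero along the whole optimization path, which is exactly the "stays nonsingular near initialization" step the reviewers question.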
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the authors introduce virtual mcts, which approximates its vanilla version with a smaller amount of computation. they also perform theoretical analysis as well as empirical performance analysis on 9x9 go and the atari games.

while the topic seems reasonable, i have difficulty understanding the problem they attempt to address (see the questions). typically, game-playing programs have time-management algorithms, but the authors do not mention that line of work, for example httpsdkemaastrichtuniversitynlmwinandsdocumentstimemanagementformontecarlotreesearchpdf httpswwwremicoulomfrpublicationstimemanagementpdf. in addition, their comparison is against vanilla uct, but another naive baseline is to stop uct if the best move is much better than the others; this could be estimated by considering the number of visits and the reward received so far.

misc
page 3, equation 1: is this correct?
page 3: "thus mcts-based rl is roughly n times computationally more expansive than traditional rl algorithms" (expansive should be expensive)
page 3: "it is consists of two components" should be "it consists of"
page 4: "illustrated in algorithm 1 2" should be "algorithms 1 and 2"

na

docsep
this paper proposes a modification to the monte carlo tree search paradigm, specifically the uct algorithm, that is more sample efficient. the method attempts to use its thinking time more wisely: as the authors describe it, their tree-building algorithm seeks to spend more time when evaluating harder states and less time on simpler states, ie states that are less positionally complex and where the best action can be determined more easily. identifying these situations is accomplished by noting when additional iterations of search would not appreciably change the visitation distribution of the actions at the root node, ie the stochastic policy at the root node. specifically, on each iteration of tree-building, after the traditional loop of node selection / expansion / evaluation / value propagation is completed, some additional computation is performed, what the authors term virtual expansions. in this phase, n - k pulls of the arms of the root node are performed according to the selection strategy (eg ucb1) but without descending down the tree further (n = total search budget measured in iterations, k = current iteration index). instead, after each pull, the current average utility of that action, q(s, a), is used as the reward, so that the only modification to the statistics accumulated in the nodes at level 1 of the search tree is in the visitation counts. the policy induced by the combination of regular tree-building and virtual expansions is tracked over time; when it begins to show signs of convergence, ie the norm between the policies from two sufficiently different iterations is sufficiently small, the search is terminated. the authors provide two theorems that characterize the nature of this convergence, as well as empirical evaluation in several domains (9x9 go and five atari games) to demonstrate the validity and benefit of their approach.

originality: the simplicity of the approach is a strength in my opinion. the idea is fairly straightforward, and the fact that it results in significant gains in a number of application domains is noteworthy.

significance: mcts is of broad interest in the ml and ai communities, and papers that deal with enhancements to the baseline algorithm or propose novel applications are often published at neurips, so the paper is likely to be of interest to many researchers.

quality: the paper contains a nice balance
of theory and experiment. the main theorems suggest that the authors' proposed approach should provide some benefits, and the empirical evaluation reinforces this. the gains in performance are particularly strong in 4/5 atari domains, where vmcts (the authors' approach) outperforms vanilla mcts while building much smaller search trees, typically only 50% as big. the ablation experiments involving tuning the algorithm hyperparameters r (the search budget ratio) and epsilon (which determines the tolerance criterion for convergence) were useful for evaluating their impact on performance.

the results in 9x9 go were, however, less compelling to me; the evidence here did not appear to support the authors' conclusions. vanilla uct was shown to be a little slower to act but also a little stronger as a player, so claiming that vmcts was somehow preferable here or a superior choice (265-266: "therefore such termination rule can keep strong performances with less budget") is not borne out by the data. the qualitative analysis in fig 3 about vmcts spending more time in challenging positions and less time in less complex positions was welcome, but drawing such samples from a single game seems a little anecdotal; aggregating information over many more such positions would have made for a stronger argument and paper.

clarity: unfortunately this was the weakest element of the paper, which relegates it to a borderline accept in my eyes. the lack of clear writing and visualizations made it harder to evaluate some of the claims; it also raises questions about the reproducibility of the work. more specifics are provided in the next section, but i outline some broad issues here: no details are provided about the training procedure (efficientzero is mentioned in section 5.1, but it is not clear if that was the training procedure that was used); it is not clear if the code for reproducing the results will be made available; i found fig 1a difficult to understand, as it lumps together different algorithms that are being parameterized in different ways; i was also a little confused by why there was a curve for gnugo, which is an off-the-shelf go playing engine (presumably the authors did not retrain this agent from scratch, so why are there different points along the x-axis for this player); the proofs for theorems 4.1 and 4.2 are presumably in the appendix, so they could not be verified; while i understand the authors are operating under space constraints, it would have been nice to at least see a proof sketch.

yes, any concerns have been raised in other sections.

docsep
this paper proposed a novel method named virtual mcts (vmcts) to reduce the computation time of vanilla mcts with the virtual expended termination rule (vetrule). this paper also gives theoretical bounds for the proposed method and evaluates the performance and computation on the 9x9 go board game and atari games. experiments show that this method can achieve comparable performance to the original search algorithm while requiring less than 50% of the search times on average. in general, this paper shows that vmcts can think faster on simple states and longer on hard states. for solid theoretical analysis and positive experimental results, i would recommend the acceptance of this paper.

strengths
this paper gives the theoretical bounds of the proposed method with proof: namely, for any given epsilon, with sufficiently large n, if the policy distance between iterations k and k/2 is smaller than epsilon, then the distance between iterations k and n is smaller than 3 epsilon. this theoretical result supports an early stop of the mcts search, which i believe has some
impact. the theoretical analysis is sound after i verify the proofs of theorems 4.1 and 4.2 in the appendix (a little bit hard to read, though). in the experiments with 9x9 go and atari, vmcts shows that it can achieve comparable performance to the original search algorithm while requiring less than 50% of the search times on average. there is a comparison with dsmcts, a past work from aaai, and vmcts still has better performance.

weaknesses
some minor presentation problems, listed in the questions section.

yes

docsep
this paper proposes vmcts, containing two main improvements over classic mcts. first, the authors propose virtual expansion: applying rollouts without actual simulation, where the simulation returns are replaced by the current q value. next, vmcts uses an adaptive termination condition to decide when to stop doing rollouts. the proposed algorithm is evaluated on 9x9 go and five atari games.

improving the efficiency of mcts is very important, since it typically requires a huge amount of computation resources. this paper proposes two techniques: virtual expansion and an early-termination condition.

virtual expansion aims to mitigate the problem that the exploratory behavior of the mcts rollouts can mask out information about the action with the highest expected reward. for example, if the agent is given a budget of 100 rollouts and it has only figured out a promising path at the 95th rollout, the last 5 rollouts may not be sufficient to backpropagate this information up to the root node; in such cases, applying virtual expansion to backpropagate such information could be helpful. this is also justified by the ablation study in sec 5.4. however, while virtual expansion could be useful in the abovementioned scenario, i think it (i) could be implemented by an action selection policy for best-arm identification (bai) and (ii) might only be useful in games with less chaotic rewards. (i) for example, it is possible that simply selecting the action that leads to the best cumulative reward seen in the rollouts can be as good as virtual expansion. (ii) for tasks where the reward function has high variance, it seems possible that virtual expansion will be fooled by some high rewards collected by chance. to allow better justification of the virtual expansion idea, it would be great if the authors could compare it with other action selection strategies, eg select by q value and some bai strategies.

the theoretical analysis in the paper shows that vmcts will converge to the optimal policy as the number of rollouts increases, but it does not provide additional intuition on the comparison between vanilla mcts and vmcts; specifically, according to thm 4.1 we still need k to be sufficiently large in order to guarantee eg bai. also, the theorems do not take into consideration the early-termination condition.

the authors addressed the limitations and potential negative societal impact of their work. ### Summary:
i found this to be an interesting paper. as the reviewers indicated, it could be improved in terms of clarity, and i strongly encourage the authors to consider those comments carefully, as ultimately this could only make their paper more impactful. in particular, the authors could consider how to be clearer about their claims and how to provide stronger evidence for these. for instance, a claim like "it can maintain comparable performances while reducing half of the time to search adaptively" is very general, and it is unclear that it is really true; for instance, is this true under all conditions? that said, i believe the paper is clear enough and the method is simple enough that it might be of interest to the community, and i think it would be good to accept it for presentation at the conference. this agrees with most reviewers, three of whom voted to accept the paper. i do agree with the one reviewer voting to reject that im somewhat unsure how this compares to other reasonable approaches, but i think this can be further discussed in followup papers as well
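the virtual-expansion and termination mechanism described in the reviews above can be sketched as follows. this is a minimal illustration with invented names (root.q, root.visits, run_one_mcts_iteration, the ucb constant, the l1 distance), not the authors' implementation; it only assumes a root node that exposes per-action visit counts and mean value estimates:

```python
import numpy as np

def virtual_expansion_policy(q, visits, n_total, k, c_ucb=1.0):
    """simulate the remaining n_total - k pulls of the root arms without
    descending the tree: each virtual pull picks the arm with the best
    ucb score, increments only its (virtual) visit count, and reuses the
    current mean value q[a] as the reward."""
    v = visits.astype(float)
    for _ in range(n_total - k):
        ucb = q + c_ucb * np.sqrt(np.log(v.sum() + 1.0) / (v + 1e-8))
        v[int(np.argmax(ucb))] += 1.0
    return v / v.sum()                      # root visitation distribution

def run_search(root, n_total, eps=0.1):
    """one real mcts iteration per step, each followed by a virtual
    expansion of the remaining budget; stop early once the virtually
    completed policies at iterations k and k // 2 are closer than eps."""
    policies = []
    for k in range(1, n_total + 1):
        root.run_one_mcts_iteration()       # selection/expansion/evaluation/backup, assumed given
        policies.append(virtual_expansion_policy(root.q, root.visits, n_total, k))
        if k >= 2 and np.abs(policies[k - 1] - policies[k // 2 - 1]).sum() < eps:
            break
    return policies[-1]
```

one design note: the virtual pulls touch only the root statistics, so their cost is linear in the remaining budget and independent of tree depth, which is why the termination check can be run at every iteration; eps and the choice of norm are tuning knobs here, mirroring the epsilon and budget-ratio r discussed in the ablations above.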
2929, 4245, 253, 10527, 14493, 273, 253, 4081, 1332, 342, 4737, 10775, 323, 667, 1677, 299, 4277, 342, 4209, 1781, 295, 604, 253, 3646, 4181, 875, 465, 285, 465, 19, 310, 4577, 685, 299, 4277, 840, 253, 4181, 875, 465, 285, 295, 310, 4577, 685, 495, 4259, 436, 10527, 906, 8525, 271, 2393, 3523, 273, 278, 291, 84, 3186, 534, 891, 2868, 556, 690, 3486, 253, 10527, 1783, 310, 3590, 846, 891, 12654, 253, 27947, 273, 10012, 7609, 285, 5976, 275, 30762, 247, 1652, 2372, 1892, 281, 1239, 2167, 50276, 249, 253, 4679, 342, 564, 898, 89, 26, 285, 387, 1792, 31940, 291, 84, 2722, 326, 436, 1332, 476, 5115, 10870, 16226, 281, 253, 3236, 3186, 5933, 1223, 10568, 1679, 685, 2456, 3186, 2069, 327, 3388, 50275, 9088, 310, 247, 5301, 342, 277, 3610, 291, 84, 247, 2469, 789, 275, 39951, 2284, 285, 31940, 291, 84, 1335, 556, 1805, 3045, 50276, 20881, 1255, 265, 50276, 8826, 5884, 9759, 3237, 7117, 275, 3533, 2593, 50274, 9820, 5474, 33032, 2520, 2929, 29328, 31940, 291, 84, 4508, 767, 2022, 11701, 689, 10610, 278, 291, 84, 806, 253, 4477, 12661, 7503, 7466, 407, 9433, 4533, 483, 1293, 4588, 9864, 253, 9864, 6548, 403, 7932, 407, 253, 1655, 2805, 1318, 1735, 31940, 291, 84, 4648, 271, 17825, 15056, 1617, 281, 7617, 672, 281, 3523, 2509, 4533, 483, 253, 4081, 5933, 310, 6760, 327, 898, 89, 26, 564, 285, 2620, 387, 1792, 3958, 11138, 253, 6733, 273, 278, 291, 84, 310, 1077, 1774, 1580, 597, 5431, 2430, 247, 5699, 2408, 273, 13782, 5300, 436, 2929, 29328, 767, 5609, 7503, 7466, 285, 271, 2393, 49022, 1617, 50276, 22259, 7466, 13698, 281, 29966, 253, 1895, 326, 41075, 3879, 273, 253, 278, 291, 84, 4533, 8349, 476, 8989, 562, 1491, 670, 253, 2250, 342, 253, 4585, 3264, 10921, 323, 1650, 604, 253, 5570, 310, 1677, 247, 7563, 273, 2233, 4533, 8349, 285, 352, 556, 760, 15433, 562, 247, 12532, 1854, 387, 253, 5325, 394, 4533, 483, 253, 1390, 608, 4533, 8349, 778, 417, 320, 4209, 281, 896, 44263, 366, 436, 1491, 598, 281, 253, 5230, 4666, 275, 824, 2219, 9433, 7503, 7466, 281, 896, 44263, 366, 824, 1491, 812, 320, 9371, 436, 310, 671, 17285, 407, 253, 28913, 1263, 275, 4706, 8255, 50276, 35529, 1223, 7503, 7466, 812, 320, 4217, 275, 253, 1840, 13012, 10076, 891, 1158, 352, 891, 812, 320, 9009, 407, 271, 2250, 5438, 3646, 323, 1682, 1513, 8137, 270, 2284, 285, 21255, 1537, 760, 320, 4217, 275, 3958, 342, 1679, 29784, 23267, 891, 323, 1650, 352, 310, 1896, 326, 3365, 17221, 253, 2250, 326, 5644, 281, 253, 1682, 18849, 10921, 2326, 275, 253, 4533, 8349, 476, 320, 347, 1175, 347, 7503, 7466, 21255, 323, 8892, 835, 253, 10921, 1159, 556, 1029, 11041, 352, 3133, 1896, 326, 7503, 7466, 588, 320, 11213, 264, 407, 690, 1029, 23267, 5728, 407, 4839, 281, 1581, 1805, 22861, 273, 253, 7503, 7466, 2934, 352, 651, 320, 1270, 604, 253, 4477, 812, 7277, 352, 342, 643, 2250, 5438, 8130, 24088, 3609, 407, 2805, 1318, 285, 690, 270, 2284, 8130, 50276, 783, 10527, 1783, 275, 253, 2929, 2722, 326, 31940, 291, 84, 588, 29623, 281, 253, 8654, 3646, 347, 253, 1180, 273, 4533, 8349, 5459, 533, 352, 1057, 417, 2085, 3081, 30328, 327, 253, 5301, 875, 26724, 278, 291, 84, 285, 31940, 291, 84, 5742, 2556, 281, 289, 78, 7609, 359, 1335, 878, 465, 281, 320, 10481, 1781, 275, 1340, 281, 12215, 24088, 270, 2284, 671, 253, 39383, 513, 417, 1379, 715, 8180, 253, 2393, 49022, 1617, 253, 4477, 9713, 253, 7364, 285, 2442, 4016, 38058, 3486, 273, 616, 789, 2490, 187, 4118, 18435, 27, 74, 1119, 436, 281, 320, 271, 4722, 2929, 50276, 284, 253, 30628, 4860, 352, 812, 320, 5520, 275, 2426, 273, 19843, 285, 891, 7052, 11907, 253, 4477, 281, 1908, 1110, 5701, 
9257, 347, 9142, 436, 812, 760, 1056, 616, 2929, 625, 3486, 1020, 50276, 249, 1798, 253, 4477, 812, 1908, 849, 281, 320, 30909, 670, 616, 3916, 285, 849, 281, 2085, 10046, 1941, 323, 841, 50276, 1542, 4227, 247, 1750, 751, 352, 476, 6558, 10870, 16226, 1223, 8493, 2716, 273, 253, 673, 281, 3186, 5223, 1242, 310, 1077, 2087, 285, 352, 310, 12744, 326, 352, 310, 1663, 2032, 323, 4227, 310, 436, 2032, 762, 512, 2515, 50276, 3529, 753, 891, 2868, 253, 2929, 310, 2590, 2217, 285, 253, 1332, 310, 2969, 2217, 326, 352, 1537, 320, 273, 1600, 281, 253, 3114, 285, 891, 1158, 352, 651, 320, 1175, 281, 2997, 352, 323, 9759, 387, 253, 8059, 50276, 2520, 18726, 342, 954, 30628, 1264, 273, 5207, 14285, 281, 2997, 253, 2929, 50276, 74, 513, 5194, 342, 253, 581, 37317, 13423, 281, 12009, 326, 516, 8489, 31488, 849, 436, 26662, 281, 643, 5272, 7274, 533, 891, 1158, 436, 476, 320, 2007, 5469, 275, 956, 484, 9380, 347, 973 ]
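The long numeric sequences in these rows pair each Input/Output text with its tokenized form: the input_ids and labels sequences appear to be identical and the attention_mask is a run of 1s of the same length, which is consistent with causal-LM fine-tuning on the concatenated prompt, review, and summary. Below is a minimal sketch of how such a row could be produced; the gpt2 checkpoint, the max_length value, and the field wiring are assumptions, since the dump does not state which tokenizer or truncation limit was actually used.

```python
from transformers import AutoTokenizer

# hypothetical checkpoint; the actual tokenizer behind these token ids is not stated
tok = AutoTokenizer.from_pretrained("gpt2")

prompt = "Below is given a review of a research paper from a conference journal. Please write a summary of the review."
review = "### Review: the paper applies the recently proposed selfsupervised learning method ..."
summary = "### Summary: the paper proposes to apply the recently introduced ..."

enc = tok(f"{prompt}\n{review}\n{summary}", truncation=True, max_length=2048)

row = {
    "input_ids": enc["input_ids"],            # token ids of prompt + review + summary
    "attention_mask": enc["attention_mask"],  # 1 for every kept token, hence the long runs of 1s
    "labels": list(enc["input_ids"]),         # a copy of input_ids, as in standard causal-LM training
}
```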
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper applies the recently proposed selfsupervised learning method barlowtwins to graph structured data for constructing the augmented version of a graph previous methods such as edgedropping or feature masking are used the paper conducts experimental evaluation on datasets of various scales on both transductive and inductive settings strengths 1 paper is clearly written and easy to follow 2 the experimental evaluation is robust and experimental details are clearly stated weakness the paper is the straightforward extension of previous work barlowtwins for graph structured data the paper does not have any technical novelty in my opinion i am willing to increase the score of the paper if in the rebuttal the authors can clearly state the novelty of the paper wrt the barlow twins paper although the paper is clearly written and many experiments are presented the paper does not meet the bar of a top conference such as iclr because of it being a very direct application of a previous paper my review is rather short for this paper because based on the lack of novelty of this paper i do not have many questions to ask or suggestions to make docsepthe authors study symmetric selfsupervised graph representation learning without negative samples inspired by the barlow twins method previously proposed in the image domain they illustrate that using their method it is possible to achieve competitive performance to stateoftheart methods such as bgrl at a fraction of the training cost disclosure i have reviewed a previous version of this paper the authors have included thorough fullbatch experiments as well as larger scale datasets such as ogbnproducts which is much appreciated the paper is clear wellwritten and easy to follow the authors propose a simple but meaningful extension of the barlow twins idea to the graph domain and demonstrate its effectiveness on relevant experiments i think allowing for a symmetric loss is a very important direction for graph representation learning and the proposed solution is an elegant way of achieving that my main concern would be with the reported bgrl results on ogbnproducts i understand that the authors have run bgrl under the same computational budget as gbt but it appears clear that bgrl needs more time to reach peak performance would it be possible just to avoid muddying the waters for future work to run bgrl for longer and report how the performance is affected it is ok if this number is higher than gbts reported performance the authors are optimising for a different metric i think that sufficiently many of my previous concerns have been addressed and i am now leaning on the side of acceptance the authors have presented a useful extension of barlow twins into the graph domain and now have experiments in support of the industrial relevance of their method the novelty is somewhat limited as is the case for most of the recent graph ssl papers that adapt image domain techniques but it is useful in and of itself that the gains observed in images transfer well to the irregular domains docsepthis paper proposed a selfsupervised learning framework for graph representation learning based on a crosscorrelationbased loss function in the proposed framework two views of the input graph obtained by augmentation methods are passed through the same encoder to compute two embedding matrices then the barlow twins loss is used to compute the loss according to the
embedding matrices the main contribution of this paper lies in that it adapted barlow twins from vision to the graph representation learning field and evaluated the performance of this selfsupervised framework in multiple node classification tasks the proposed method achieved analogous results compared to sota methods with lower time and space complexity this paper is easy to follow wellwritten and organized this paper is a heuristic attempt to apply barlow twins in the graph domain both transductive and inductive experiments are done to evaluate the performance the proposed method achieved results on par with the sota methods while the time and space complexity are lower i have some minor concerns below for the authors to address 1 the novelty of this paper is limited the challenges to be tackled in applying barlow twins to the graph domain are unclear to me 2 data augmentation in vision tasks comes from strong human priors eg random resize cropping and horizontal flipping would not change the semantics of an image while the graph data augmentation methods used in this paper are borrowed from the previous literature it makes no sense to me for example applying edge dropping to a protein would obviously lead to different biomolecules 3 the experimental results are on par with baseline methods on most tasks considering that the low time and space complexity is coming from previous literature ie barlow twins the experimental contribution is limited 4 in terms of the encoder network and augmentation hyperparameter design the paper did not provide comprehensive analysis or ablation studies 5 the authors carefully describe the downstream datasets maybe i missed it but i dont find the pretraining dataset used in the experiment this paper adapted the recent barlow twins to selfsupervised graph representation learning and provided some informative empirical experiment results with such interesting trials the reviewer expects to see the concerns well addressed ### Summary:
the paper proposes to apply the recently introduced barlowtwins contrastive learning objective to the case of graph networks the main concern raised by reviewers was the limited novelty of this work which they argued mostly combines existing lines of work and does not introduce sufficiently new concepts this was also discussed between the authors and the reviewers having read the paper and the reviews i tend to agree with the reviewers that this paper is more of a combination of existing works and their relatively straightforward application to the graph network domain thus although the empirical results are encouraging i agree the paper has limited novelty and falls below the iclr acceptance bar
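The cross-correlation objective described in this row's reviews (two augmented views of the same graph, one shared encoder, a loss that pushes the embedding cross-correlation matrix toward the identity) can be made concrete with a short sketch. This is a generic illustration of a Barlow Twins-style loss, not the authors' code; the lambda weight, the embedding sizes, and the use of PyTorch are assumptions.

```python
import torch

def barlow_twins_loss(z_a, z_b, lambda_offdiag=5e-3, eps=1e-9):
    """Cross-correlation loss between two (num_nodes, dim) embedding matrices."""
    n, d = z_a.shape
    # standardize each embedding dimension across the batch of nodes
    z_a = (z_a - z_a.mean(dim=0)) / (z_a.std(dim=0) + eps)
    z_b = (z_b - z_b.mean(dim=0)) / (z_b.std(dim=0) + eps)
    # empirical (d, d) cross-correlation matrix between the two views
    c = (z_a.T @ z_b) / n
    identity = torch.eye(d, device=c.device)
    on_diag = ((torch.diagonal(c) - 1) ** 2).sum()        # pull diagonal toward 1
    off_diag = ((c * (1 - identity)) ** 2).sum()          # push off-diagonal toward 0
    return on_diag + lambda_offdiag * off_diag

# toy usage: embeddings of two augmented views of the same 256-node graph
z1, z2 = torch.randn(256, 64), torch.randn(256, 64)
loss = barlow_twins_loss(z1, z2)
```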
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 10384, 253, 4102, 4081, 1881, 35421, 4715, 1332, 2534, 676, 7553, 968, 281, 4216, 18872, 941, 323, 26736, 253, 31612, 2715, 273, 247, 4216, 2045, 3082, 824, 347, 1407, 2400, 287, 2784, 390, 4735, 44790, 403, 908, 253, 2929, 2589, 84, 5661, 7103, 327, 15302, 273, 2710, 11498, 327, 1097, 811, 43324, 285, 42115, 4758, 50275, 296, 3755, 384, 84, 50276, 18, 2929, 310, 4518, 3542, 285, 3477, 281, 956, 374, 253, 5661, 7103, 310, 10237, 285, 5661, 4278, 403, 4518, 4767, 50276, 20881, 1255, 50276, 783, 2929, 310, 253, 15246, 6880, 273, 2045, 789, 2534, 676, 7553, 968, 323, 4216, 18872, 941, 253, 2929, 1057, 417, 556, 667, 7681, 38135, 275, 619, 4743, 891, 717, 7378, 281, 2572, 253, 4868, 273, 253, 2929, 604, 275, 253, 30080, 22559, 253, 4477, 476, 4518, 1375, 253, 38135, 273, 253, 2929, 8772, 281, 253, 2534, 676, 19661, 2929, 50276, 20261, 253, 2929, 310, 4518, 3542, 285, 1142, 4679, 403, 3559, 253, 2929, 1057, 417, 2525, 253, 2534, 273, 253, 1755, 8059, 824, 347, 17857, 32888, 984, 273, 352, 1146, 247, 1077, 1480, 2898, 273, 247, 2045, 2929, 50275, 2577, 2278, 310, 2581, 2159, 323, 436, 2929, 984, 1754, 1754, 327, 253, 3480, 273, 38135, 273, 436, 2929, 891, 513, 417, 452, 1142, 3533, 281, 1642, 390, 13991, 281, 1056, 5474, 339, 431, 248, 4477, 1263, 13123, 1881, 35421, 4216, 6779, 4715, 1293, 4016, 3530, 11797, 407, 253, 2534, 676, 26664, 1332, 3786, 4081, 275, 253, 2460, 5028, 597, 17093, 326, 970, 616, 1332, 352, 310, 1896, 281, 5115, 12085, 3045, 281, 1375, 23037, 14387, 3082, 824, 347, 270, 737, 77, 387, 247, 6919, 273, 253, 3733, 2105, 13911, 891, 452, 9814, 247, 2045, 2715, 273, 436, 2929, 253, 4477, 452, 2908, 11080, 2120, 23941, 4679, 347, 973, 347, 4067, 4311, 15302, 824, 347, 9040, 15453, 23832, 534, 310, 1199, 14109, 50276, 783, 2929, 310, 2590, 973, 15720, 285, 3477, 281, 956, 253, 4477, 12661, 247, 2969, 533, 14282, 6880, 273, 253, 2534, 676, 26664, 2934, 281, 253, 4216, 5028, 285, 7568, 697, 12510, 327, 4623, 4679, 891, 1158, 6941, 323, 13123, 2957, 310, 247, 1077, 1774, 3884, 323, 4216, 6779, 4715, 285, 253, 4081, 2900, 310, 271, 20654, 1039, 273, 17170, 326, 50276, 2577, 2022, 4468, 651, 320, 342, 253, 2361, 270, 737, 77, 1543, 327, 9040, 15453, 23832, 891, 2096, 326, 253, 4477, 452, 6337, 270, 737, 77, 762, 253, 1072, 15180, 7563, 347, 305, 2612, 533, 352, 4620, 2590, 326, 270, 737, 77, 3198, 625, 673, 281, 3986, 5241, 3045, 651, 352, 320, 1896, 816, 281, 3693, 278, 7937, 3184, 253, 12685, 323, 2852, 789, 281, 1408, 270, 737, 77, 323, 3356, 285, 1304, 849, 253, 3045, 310, 5876, 352, 310, 8718, 604, 436, 1180, 310, 2169, 685, 305, 67, 1641, 2361, 3045, 50276, 783, 4477, 403, 5556, 2182, 323, 247, 1027, 7982, 891, 1158, 326, 10481, 1142, 273, 619, 2045, 7350, 452, 644, 9713, 285, 891, 717, 1024, 25661, 327, 253, 1930, 273, 14924, 253, 4477, 452, 3559, 247, 4217, 6880, 273, 2534, 676, 26664, 715, 253, 4216, 5028, 285, 1024, 452, 4679, 275, 1329, 273, 253, 9787, 17200, 273, 616, 1332, 253, 38135, 310, 8489, 3710, 347, 310, 253, 1083, 323, 954, 273, 253, 3332, 4216, 256, 3433, 9380, 326, 5223, 2460, 5028, 5609, 533, 352, 310, 4217, 275, 285, 273, 3139, 326, 253, 15988, 2540, 275, 3888, 3700, 973, 281, 253, 17948, 10625, 5474, 33032, 2520, 2929, 4081, 247, 1881, 35421, 4715, 7792, 323, 4216, 6779, 4715, 1754, 327, 247, 2831, 39305, 3169, 2957, 1159, 275, 253, 4081, 7792, 767, 6849, 273, 253, 3280, 4216, 2797, 407, 42072, 3082, 403, 
4817, 949, 253, 1072, 32049, 281, 11897, 767, 21496, 12624, 840, 2534, 676, 26664, 2957, 310, 908, 281, 11897, 253, 2957, 2556, 281, 253, 21496, 12624, 253, 2022, 7680, 273, 436, 2929, 8696, 275, 326, 352, 12956, 2534, 676, 26664, 432, 8113, 281, 4216, 6779, 4715, 1673, 285, 6760, 253, 3045, 273, 436, 1881, 35421, 7792, 275, 2709, 4666, 9162, 8892, 253, 4081, 1332, 6786, 19890, 1543, 2429, 281, 256, 5503, 3082, 342, 2406, 673, 285, 2317, 10454, 50276, 2520, 2929, 310, 3477, 281, 956, 973, 15720, 285, 10932, 436, 2929, 310, 247, 47641, 3177, 281, 4647, 2534, 676, 26664, 275, 4216, 5028, 1097, 811, 43324, 285, 42115, 4679, 403, 2218, 281, 7472, 253, 3045, 253, 4081, 1332, 6786, 1543, 327, 1061, 342, 253, 256, 5503, 3082, 1223, 253, 673, 285, 2317, 10454, 403, 2406, 50276, 74, 452, 690, 5884, 7350, 2708, 323, 253, 4477, 281, 2953, 50276, 18, 253, 38135, 273, 436, 2929, 310, 3710, 253, 7881, 281, 320, 18915, 273, 253, 2898, 670, 2534, 676, 26664, 275, 4216, 5028, 310, 12744, 281, 479, 50275, 19, 941, 42072, 275, 8113, 8892, 3249, 432, 2266, 1966, 2720, 24088, 3632, 39332, 9187, 2784, 285, 11593, 46899, 651, 417, 1818, 253, 24705, 273, 271, 2460, 1223, 253, 4216, 941, 42072, 3082, 908, 275, 436, 2929, 310, 29563, 432, 2045, 4133, 2478, 352, 2789, 642, 3282, 281, 479, 323, 1650, 9433, 5024, 18752, 281, 247, 2601, 651, 9090, 1421, 281, 1027, 9303, 8666, 50276, 20, 253, 5661, 1543, 403, 327, 1061, 342, 8245, 3082, 327, 253, 954, 8892, 7296, 326, 253, 1698, 673, 285, 2317, 10454, 310, 3551, 432, 2045, 6239, 26332, 2534, 676, 26664, 253, 5661, 7680, 310, 3710, 50276, 21, 275, 2426, 273, 253, 32049, 2990, 285, 42072, 4373, 19484, 2216, 253, 2929, 858, 417, 2085, 11088, 1783, 390, 28913, 2175, 50276, 22, 253, 4477, 9257, 6266, 253, 15450, 15302, 5046, 891, 2985, 352, 533, 891, 13414, 1089, 253, 3215, 11273, 10895, 908, 275, 253, 3368, 436, 2929, 12956, 253, 3332, 2534, 676, 26664, 281, 1881, 35421, 4216, 6779, 4715, 285, 2530, 690, 27096, 16774, 3368, 1543, 342, 824, 4722, 7587, 253, 37317, 3264, 281, 923, 253, 7350, 403, 973, 9713, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 281, 897, 253, 4102, 5611, 2534, 676, 7553, 968, 4499, 422, 4715, 8103, 281, 253, 1083, 273, 4216, 6928, 253, 2022, 4468, 5439, 407, 30628, 369, 253, 3710, 38135, 273, 436, 789, 534, 597, 9125, 6571, 24772, 5368, 3104, 273, 789, 285, 1057, 417, 9569, 10481, 747, 12342, 436, 369, 671, 5469, 875, 253, 4477, 285, 253, 30628, 1907, 1239, 253, 2929, 285, 253, 10123, 891, 5257, 281, 5194, 342, 253, 30628, 326, 436, 2929, 310, 625, 273, 247, 5019, 273, 5368, 2987, 285, 616, 4942, 15246, 2898, 281, 253, 4216, 2990, 5028, 3021, 3738, 253, 16774, 1543, 403, 18462, 891, 5194, 253, 2929, 556, 3710, 38135, 285, 11521, 2708, 253, 17857, 32888, 14924, 2534 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 10384, 253, 4102, 4081, 1881, 35421, 4715, 1332, 2534, 676, 7553, 968, 281, 4216, 18872, 941, 323, 26736, 253, 31612, 2715, 273, 247, 4216, 2045, 3082, 824, 347, 1407, 2400, 287, 2784, 390, 4735, 44790, 403, 908, 253, 2929, 2589, 84, 5661, 7103, 327, 15302, 273, 2710, 11498, 327, 1097, 811, 43324, 285, 42115, 4758, 50275, 296, 3755, 384, 84, 50276, 18, 2929, 310, 4518, 3542, 285, 3477, 281, 956, 374, 253, 5661, 7103, 310, 10237, 285, 5661, 4278, 403, 4518, 4767, 50276, 20881, 1255, 50276, 783, 2929, 310, 253, 15246, 6880, 273, 2045, 789, 2534, 676, 7553, 968, 323, 4216, 18872, 941, 253, 2929, 1057, 417, 556, 667, 7681, 38135, 275, 619, 4743, 891, 717, 7378, 281, 2572, 253, 4868, 273, 253, 2929, 604, 275, 253, 30080, 22559, 253, 4477, 476, 4518, 1375, 253, 38135, 273, 253, 2929, 8772, 281, 253, 2534, 676, 19661, 2929, 50276, 20261, 253, 2929, 310, 4518, 3542, 285, 1142, 4679, 403, 3559, 253, 2929, 1057, 417, 2525, 253, 2534, 273, 253, 1755, 8059, 824, 347, 17857, 32888, 984, 273, 352, 1146, 247, 1077, 1480, 2898, 273, 247, 2045, 2929, 50275, 2577, 2278, 310, 2581, 2159, 323, 436, 2929, 984, 1754, 1754, 327, 253, 3480, 273, 38135, 273, 436, 2929, 891, 513, 417, 452, 1142, 3533, 281, 1642, 390, 13991, 281, 1056, 5474, 339, 431, 248, 4477, 1263, 13123, 1881, 35421, 4216, 6779, 4715, 1293, 4016, 3530, 11797, 407, 253, 2534, 676, 26664, 1332, 3786, 4081, 275, 253, 2460, 5028, 597, 17093, 326, 970, 616, 1332, 352, 310, 1896, 281, 5115, 12085, 3045, 281, 1375, 23037, 14387, 3082, 824, 347, 270, 737, 77, 387, 247, 6919, 273, 253, 3733, 2105, 13911, 891, 452, 9814, 247, 2045, 2715, 273, 436, 2929, 253, 4477, 452, 2908, 11080, 2120, 23941, 4679, 347, 973, 347, 4067, 4311, 15302, 824, 347, 9040, 15453, 23832, 534, 310, 1199, 14109, 50276, 783, 2929, 310, 2590, 973, 15720, 285, 3477, 281, 956, 253, 4477, 12661, 247, 2969, 533, 14282, 6880, 273, 253, 2534, 676, 26664, 2934, 281, 253, 4216, 5028, 285, 7568, 697, 12510, 327, 4623, 4679, 891, 1158, 6941, 323, 13123, 2957, 310, 247, 1077, 1774, 3884, 323, 4216, 6779, 4715, 285, 253, 4081, 2900, 310, 271, 20654, 1039, 273, 17170, 326, 50276, 2577, 2022, 4468, 651, 320, 342, 253, 2361, 270, 737, 77, 1543, 327, 9040, 15453, 23832, 891, 2096, 326, 253, 4477, 452, 6337, 270, 737, 77, 762, 253, 1072, 15180, 7563, 347, 305, 2612, 533, 352, 4620, 2590, 326, 270, 737, 77, 3198, 625, 673, 281, 3986, 5241, 3045, 651, 352, 320, 1896, 816, 281, 3693, 278, 7937, 3184, 253, 12685, 323, 2852, 789, 281, 1408, 270, 737, 77, 323, 3356, 285, 1304, 849, 253, 3045, 310, 5876, 352, 310, 8718, 604, 436, 1180, 310, 2169, 685, 305, 67, 1641, 2361, 3045, 50276, 783, 4477, 403, 5556, 2182, 323, 247, 1027, 7982, 891, 1158, 326, 10481, 1142, 273, 619, 2045, 7350, 452, 644, 9713, 285, 891, 717, 1024, 25661, 327, 253, 1930, 273, 14924, 253, 4477, 452, 3559, 247, 4217, 6880, 273, 2534, 676, 26664, 715, 253, 4216, 5028, 285, 1024, 452, 4679, 275, 1329, 273, 253, 9787, 17200, 273, 616, 1332, 253, 38135, 310, 8489, 3710, 347, 310, 253, 1083, 323, 954, 273, 253, 3332, 4216, 256, 3433, 9380, 326, 5223, 2460, 5028, 5609, 533, 352, 310, 4217, 275, 285, 273, 3139, 326, 253, 15988, 2540, 275, 3888, 3700, 973, 281, 253, 17948, 10625, 5474, 33032, 2520, 2929, 4081, 247, 1881, 35421, 4715, 7792, 323, 4216, 6779, 4715, 1754, 327, 247, 2831, 39305, 3169, 2957, 1159, 275, 253, 4081, 7792, 767, 6849, 273, 253, 3280, 4216, 2797, 407, 42072, 3082, 403, 
4817, 949, 253, 1072, 32049, 281, 11897, 767, 21496, 12624, 840, 2534, 676, 26664, 2957, 310, 908, 281, 11897, 253, 2957, 2556, 281, 253, 21496, 12624, 253, 2022, 7680, 273, 436, 2929, 8696, 275, 326, 352, 12956, 2534, 676, 26664, 432, 8113, 281, 4216, 6779, 4715, 1673, 285, 6760, 253, 3045, 273, 436, 1881, 35421, 7792, 275, 2709, 4666, 9162, 8892, 253, 4081, 1332, 6786, 19890, 1543, 2429, 281, 256, 5503, 3082, 342, 2406, 673, 285, 2317, 10454, 50276, 2520, 2929, 310, 3477, 281, 956, 973, 15720, 285, 10932, 436, 2929, 310, 247, 47641, 3177, 281, 4647, 2534, 676, 26664, 275, 4216, 5028, 1097, 811, 43324, 285, 42115, 4679, 403, 2218, 281, 7472, 253, 3045, 253, 4081, 1332, 6786, 1543, 327, 1061, 342, 253, 256, 5503, 3082, 1223, 253, 673, 285, 2317, 10454, 403, 2406, 50276, 74, 452, 690, 5884, 7350, 2708, 323, 253, 4477, 281, 2953, 50276, 18, 253, 38135, 273, 436, 2929, 310, 3710, 253, 7881, 281, 320, 18915, 273, 253, 2898, 670, 2534, 676, 26664, 275, 4216, 5028, 310, 12744, 281, 479, 50275, 19, 941, 42072, 275, 8113, 8892, 3249, 432, 2266, 1966, 2720, 24088, 3632, 39332, 9187, 2784, 285, 11593, 46899, 651, 417, 1818, 253, 24705, 273, 271, 2460, 1223, 253, 4216, 941, 42072, 3082, 908, 275, 436, 2929, 310, 29563, 432, 2045, 4133, 2478, 352, 2789, 642, 3282, 281, 479, 323, 1650, 9433, 5024, 18752, 281, 247, 2601, 651, 9090, 1421, 281, 1027, 9303, 8666, 50276, 20, 253, 5661, 1543, 403, 327, 1061, 342, 8245, 3082, 327, 253, 954, 8892, 7296, 326, 253, 1698, 673, 285, 2317, 10454, 310, 3551, 432, 2045, 6239, 26332, 2534, 676, 26664, 253, 5661, 7680, 310, 3710, 50276, 21, 275, 2426, 273, 253, 32049, 2990, 285, 42072, 4373, 19484, 2216, 253, 2929, 858, 417, 2085, 11088, 1783, 390, 28913, 2175, 50276, 22, 253, 4477, 9257, 6266, 253, 15450, 15302, 5046, 891, 2985, 352, 533, 891, 13414, 1089, 253, 3215, 11273, 10895, 908, 275, 253, 3368, 436, 2929, 12956, 253, 3332, 2534, 676, 26664, 281, 1881, 35421, 4216, 6779, 4715, 285, 2530, 690, 27096, 16774, 3368, 1543, 342, 824, 4722, 7587, 253, 37317, 3264, 281, 923, 253, 7350, 403, 973, 9713, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 281, 897, 253, 4102, 5611, 2534, 676, 7553, 968, 4499, 422, 4715, 8103, 281, 253, 1083, 273, 4216, 6928, 253, 2022, 4468, 5439, 407, 30628, 369, 253, 3710, 38135, 273, 436, 789, 534, 597, 9125, 6571, 24772, 5368, 3104, 273, 789, 285, 1057, 417, 9569, 10481, 747, 12342, 436, 369, 671, 5469, 875, 253, 4477, 285, 253, 30628, 1907, 1239, 253, 2929, 285, 253, 10123, 891, 5257, 281, 5194, 342, 253, 30628, 326, 436, 2929, 310, 625, 273, 247, 5019, 273, 5368, 2987, 285, 616, 4942, 15246, 2898, 281, 253, 4216, 2990, 5028, 3021, 3738, 253, 16774, 1543, 403, 18462, 891, 5194, 253, 2929, 556, 3710, 38135, 285, 11521, 2708, 253, 17857, 32888, 14924, 2534 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the authors introduce a quantum pyramidal circuit to achieve an orthogonal layer of a neural network which is fast and can maintain orthogonality the angle gradients are derived the authors implement the orthogonal nn on simulators and quantum machines to demonstrate the effectiveness strength 1 the paper gives a detailed derivation of the construction of the orthogonal nn layer and gives detailed gradient computation 2 the orthogonal quantum nns are implemented on real quantum machines weakness 1 major the contribution and novelty of the paper are debatable using 2dimensional unitary planar rotators to construct arbitrary unitary operations is wellestablished 1 the beam splitter gate is a tunable machzehnder interferometer mzi the unitary group parametrization was described in section 2 of the unitary and rotation groups francis d murnaghan 1962 using triangular array reck style 2 and rectangular array clements style 3 to construct arbitrary unitaries was proposed years ago and is widely applied in optical neural networks 4 training rotation phases in the unitary parametrization space with various robustness and discretization considerations already exists in the literature 5 there are also techniques proposed to train rotation phases directly onchip in situ without softwareassisted backpropagation 6 prior work also exists to solve gradient explosion issues in rnns using those unitary operators 7 2 major the robustness of the constructed orthogonal layer is debatable the triangular array constructs an arbitrary unitary with 2n-3 depth such a deeplycascaded layer will encounter severe angle error accumulation issues which is a wellobserved phenomenon in optical neural networks 8 shallower quantum neural layers might be more resilient to quantum noise rectangular clements style and butterfly meshes will be more noisetolerant 3 major experiments are weak not enough data and experiments to support the effectiveness of the orthogonal qnn layer discussion on efficiency and robustness of the orthogonal layer is missing 4 minor training such triangular quantum unitary layers using backpropagation consumes considerable memory as each stage needs to store intermediate results which is not quite scalable or efficient also training such unitary meshes in the rotation angle space is sensitive to initialization without careful initialization the signals will collapse into the local paths with slow convergence 9 1 nicholas c harris jacques carolan darius bunandar et al linear programmable nanophotonic processors optica 2018 2 m reck a zeilinger h bernstein et al experimental realization of any discrete unitary operator physical review letters 1994 3 william r clements peter c humphreys benjamin j metcalf w steven kolthammer and ian a walmsley optimal design for universal multiport interferometers optica 2016 4 y shen n c harris s skirlo et al deep learning with coherent nanophotonic circuits nature photonics 2017 5 j gu z zhao c feng h zhu r t chen and d z pan roq a noiseaware quantization scheme towards robust optical neural networks with lowbit controls date 2020 6 tyler w hughes momchil minkov yu shi shanhui fan training of photonic neural networks through in situ backpropagation and gradient measurement optica 2018 7 li jing yichen shen tena dubcek john peurifoy et al tunable efficient unitary neural networks eunn and their application to rnns icml 2017 8 michael ys fang sasikanth manipatruni
casimir wierzynski et al design of optical neural networks with component imprecisions optics express 2019 9 sunil pai ben bartlett olav solgaard david a b miller matrix optimization on universal unitary photonic devices physical review applied 2019 the originality of the paper content needs further justification the method introduced was wellestablished in photonic computing and the machine learning community docsepa parameterization of nns that guarantees orthogonality and is easily implementable using classical as well as quantum computing the paper proposes a pyramidal circuit as a model architecture that can be implemented on quantum devices and in the same form on classical devices on quantum devices it is implemented using 2qubit 2level gates rbs gates and on classical devices as a sequence of planar rotations in both cases there are n(n-1)/2 trainable parameters that also only need o(n^2) time in the backward pass strengths single unified architecture that can be implemented on classical and quantum devices simple data loading procedure for a quantum computer that does not require assumptions about qram good error characteristics of the unary representation experimental results showing similar accuracy for the classical implementation implementation on a quantum simulator and implementation on a quantum computer weaknesses as the authors note computing wx for x with n features uses uw x where uw is a 2^n times 2^n unitary ie one needs n qubits for an ndimensional dataset literature review for the classical orthogonal nns is missing some prior work similar to the pyramidal circuit refs 1 and 2 propose a decomposition of an orthogonal matrix in nns into a product of simpler matrices with o(n^2) trainable parameters in total in a way similar to how the q matrix is decomposed in qr decomposition 1 uses householder projections while 2 uses givens rotations essentially the same building block as the rbs gate 1 dorobantu et al dizzyrnn reparameterizing recurrent neural networks for normpreserving backpropagation 2 mhammedi et al efficient orthogonal parametrisation of recurrent neural networks using householder reflections while the efficient parameterization guaranteeing orthogonality is not entirely novel in classical ml showing a single architecture that works on gpus and on nisq devices can gather significant interest from ml and qml communities docsepthe authors propose a new type of neural network layer called a pyramidal circuit this layer implements an orthogonal matrix multiplication and it is claimed that it allows for gradient descent to be run while maintaining perfect orthogonality and with the same asymptotic running time as an arbitrary fully connected layer this algorithm is inspired by quantum computing and it can be applied on a classical computer or a near term nisq quantum computer to add supporting evidence that the pyramidal circuit works the authors include some experiments demonstrating that networks with pyramidal circuits can achieve some level of accuracy on binary classification problems comments and concerns reproducibility states we have explained in detail each part of both algorithms implementation it is in our belief that anyone with classical software skills can reimplement the classical algorithm if this is the authors belief what harm would it do to simply release the code furthermore could a classically trained software developer manage to reproduce the experiments on the real quantum device in a reasonable time frame it is doubtful as i am a software engineer who works with
ibmq devices every day and i know firsthand that any dealings with such devices are highly nontrivial im doubtful i could reproduce the results in this paper in a reasonable time frame if at all section 5 numerical experiments is lacking significant details the authors claim to achieve 98% accuracy on a binary mnist classification task on the real quantum computer ibmq bogota state preparation and measurement errors for qubits on this device very often exceed 2% this fact alone makes me highly skeptical of the results perhaps the authors did something more clever than what was presented in the paper to achieve this result but its impossible to know without the code released and without a very detailed description its unclear to me how significant the results in table 2 are mnist with all 10 classes can be classified with 99% accuracy pca on mnist with 2 principal components renders the classes 6 and 9 nearly linearly separable i question if 95% accuracy can be achieved by the correct placement of a hyperplane using 4 principal components if so is the network playing a significant role at all figure 10 if im reading this figure correctly i believe its showing a test accuracy on mnist of 85% this is very low compared to state of the art and low compared to simple offtheshelf methods perhaps im not understanding the significance of this or reading it wrong altogether section 4 would benefit greatly by being organized into a mathematical proof format where things are clearly defined beforehand and not just italicized if they mean something particular its hard to assess the correctness of the claims in the format theyre provided in section 3 states therefore for any given neural networks orthogonal layer there is a quantum pyramidal circuit that reproduces it again a statement like this is big and requires a mathematical proof some loose form of justification is provided equation 5 doesnt make sense to me if w is an n by n matrix x is ndimensional then wx is ndimensional if uw is 2^n by 2^n then how can it operate on x figure 6 very hard to follow where this matrix came from i believe its the product of rbs gates section 22 states the important property to note is that the number of parameters of the quantum pyramidal circuit corresponding to a neural network layer of size n x d is (2n - 1 - d) d / 2 exactly the same as the number of degrees of freedom of an orthogonal matrix of dimension n x d i believe the authors intended to say weight matrix instead of layer as a neural network layer can consist of more than a weight matrix typesetting and typos fig1 vs eq1 doesnt feel consistent furthermore the absence of a space for fig is quite strange same with equations like n=4 with no spaces fig 2b the lines in the figure are not a consistent weight and i can not figure out what the intended meaning of this is if it relates to fig 2a its unclear how if it doesnt the caption isnt sufficient to understand why the network edges are different weights quotation marks shift in style throughout the paper suggestion instead of repeating multiple times that stheta and ctheta are sine and cosine respectively state this a single time early in the paper decimal formatting virtually all iclr papers abide by the format 123.50 rather than 123,50 to indicate 123 and one half this paper uses the latter figure 5 7th unary state to the 6th but mathematically indexing starts from 0 so the 7th unary state corresponds to j=6 this point could potentially confuse the overall idea of pyramidal circuits as neural network layers may have some novel
significance my overall understanding of the evidence supporting the claims in this paper is weak it is missing any type of demonstration or concrete proof that network weight matrices remain orthogonal during training in addition to missing any demonstration of the claimed convergence rate proof despite my greater confidence in understanding implementations i believe its reproducibility is questionable and the experimental descriptions are lacking the experimental results appear to be weak such as achieving an 85% test accuracy on mnist overall it was a difficult paper to assess docsepthis paper replaces usual weights and orthogonal matrices in orthogonal neural networks with an equivalent pyramidal circuit made of twodimensional rotations by this the authors proposed a training method for orthogonal neural networks that runs in quadratic time which is a significant improvement over previous methods based on singular value decomposition the authors also performed some numerical experiments to verify the abilities of their pyramidal circuit on the standard mnist dataset which showed that their methods are efficient for learning a classification task the novelty of this paper is high by the introduction of the pyramidal circuit the authors can implement an orthogonal matrix multiplication which allows for gradient descent with perfect orthogonality with the same asymptotic running time as a standard fully connected layer i think this is a good contribution to the deep learning community however the experiments are not very convincing since they lack the comparison of running times overall i think the results in this paper are important this paper proposed a training method for orthogonal neural networks that runs in quadratic time which is a significant improvement over previous methods based on singular value decomposition however the experiments are not very convincing since they lack the comparison of running times ### Summary:
this paper introduces a quantum pyramidal circuit for the computation of orthogonal layers in neural networks and implements the algorithm on simulators and on a quantum computer to illustrate its effectiveness it also obtains an o(n^2) classical algorithm for forward and backpropagation the reviewers generally found strength in the derivations and implementation on real quantum machines some reviewers regarded the contributions as strong and novel while others expressed skepticism about the novelty and the robustness of the algorithm having read the paper in detail i concur with the several reviewers who found the literature review of classical orthogonal nns to be lacking in particular one reviewer highlights similarities with householder reflections and givens rotations for which substantial literature already exists without a proper comparison to this existing work it is not possible to properly assess the novelty or relative contributions of the current paper beyond an extended discussion of related work the paper would also benefit from improved experimental analysis while the paper is framed around the quantum algorithm the main contributions are described as a novel and efficient classical algorithm this would indeed be a contribution of interest to the broader nonquantum iclr community but there is no experimental evidence supporting the utility of the proposed methods an analysis that compares the classical algorithm to the numerous prior works that parameterize orthogonal layers would be an essential addition as it stands i cannot recommend the paper for publication
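The classical side of the method discussed in this row, an orthogonal layer built from a cascade of two-dimensional planar (Givens/RBS-style) rotations whose angles are the trainable parameters, can be sketched in a few lines. This is an illustrative reconstruction, not the paper's implementation; the rotation ordering and the 4-dimensional toy example are assumptions. Because every planar rotation is orthogonal, gradient steps on the angles never leave the orthogonal group, which is the property the reviews focus on.

```python
import numpy as np

def pyramidal_orthogonal_apply(x, rotations):
    """Apply an orthogonal transform parameterized by planar (Givens) rotations.

    x: length-n input vector; rotations: iterable of (i, j, angle) triples, one per
    two-dimensional rotation. With n*(n-1)/2 rotations this can realize an arbitrary
    n x n special orthogonal matrix, and the output is exactly orthogonal to the
    input transform for any angle values.
    """
    y = np.asarray(x, dtype=float).copy()
    for i, j, theta in rotations:
        c, s = np.cos(theta), np.sin(theta)
        yi, yj = y[i], y[j]
        y[i], y[j] = c * yi - s * yj, s * yi + c * yj
    return y

# toy usage on a 4-dimensional input with 4*3/2 = 6 rotations in a pyramid-like order
angles = [(0, 1, 0.3), (1, 2, -0.7), (2, 3, 1.1), (0, 1, 0.2), (1, 2, 0.5), (0, 1, -0.4)]
out = pyramidal_orthogonal_apply(np.array([1.0, 0.0, 0.0, 0.0]), angles)
assert np.isclose(np.linalg.norm(out), 1.0)  # orthogonal maps preserve the norm
```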
[ 391, 246, 260, 864, 285, 277, 1182, 3199, 687, 82, 247, 6046, 13823, 36643, 6974, 4404, 10237, 5748, 11454, 6928, 342, 1698, 2713, 5760, 3522, 9169, 50276, 23, 963, 2146, 259, 15729, 1041, 2243, 348, 300, 278, 750, 729, 340, 86, 439, 74, 439, 266, 37942, 7989, 3733, 273, 2359, 5120, 11454, 6928, 949, 275, 5999, 896, 44263, 318, 285, 11786, 6814, 1478, 3737, 4765, 50276, 24, 632, 480, 272, 340, 35009, 703, 79, 3578, 66, 19155, 336, 76, 480, 2116, 759, 321, 338, 899, 1162, 355, 10839, 494, 5919, 24287, 11454, 6928, 299, 4462, 285, 616, 2898, 281, 391, 79, 2224, 17857, 1686, 4240, 50276, 25, 278, 44023, 340, 84, 269, 606, 256, 284, 1479, 14718, 9452, 255, 6321, 74, 6483, 23036, 259, 1321, 91, 1362, 9327, 1162, 355, 2216, 273, 5748, 11454, 6928, 342, 4445, 1607, 2845, 3836, 35353, 3890, 6247, 50276, 26, 5101, 300, 1349, 74, 2240, 44693, 17655, 8919, 580, 1220, 47973, 34843, 301, 247, 270, 5499, 254, 4315, 13757, 327, 10898, 24287, 2359, 5120, 4095, 3520, 2278, 3732, 6247, 50276, 783, 3236, 414, 273, 253, 2929, 2600, 3198, 2007, 22861, 253, 1332, 5611, 497, 973, 21877, 275, 2359, 5120, 12672, 285, 253, 5145, 4715, 3114, 50276, 7152, 339, 4904, 4764, 1320, 273, 295, 2224, 326, 23632, 9373, 38931, 1319, 285, 310, 4354, 3359, 494, 970, 8946, 347, 973, 347, 6318, 12672, 253, 2929, 29328, 247, 25874, 11421, 5049, 347, 247, 1566, 1116, 38413, 326, 476, 320, 9009, 327, 6318, 4095, 285, 275, 253, 1072, 830, 327, 8946, 4095, 327, 6318, 4095, 352, 310, 9009, 970, 374, 371, 2713, 374, 5251, 18488, 391, 1768, 18488, 285, 327, 8946, 4095, 347, 247, 3425, 273, 23601, 39501, 275, 1097, 2219, 627, 403, 48257, 805, 6194, 494, 3602, 326, 671, 760, 878, 327, 19, 275, 253, 19265, 1509, 20544, 50276, 20199, 27998, 10336, 326, 476, 320, 9009, 327, 8946, 285, 6318, 4095, 50276, 19583, 941, 10935, 5199, 323, 6318, 4382, 326, 1057, 417, 2430, 13260, 670, 2805, 3358, 50276, 12311, 2228, 5319, 273, 253, 440, 552, 6779, 50276, 49363, 1543, 4645, 2074, 7200, 323, 253, 8946, 7092, 7092, 327, 247, 6318, 40022, 285, 7092, 327, 247, 6318, 4382, 50276, 20881, 1255, 265, 50276, 284, 253, 4477, 3877, 12672, 22365, 323, 1269, 342, 295, 3386, 4648, 1484, 88, 1269, 835, 1484, 88, 310, 247, 374, 79, 2069, 374, 79, 24287, 26332, 581, 3198, 295, 42414, 323, 271, 295, 6967, 10895, 50276, 22478, 1177, 2278, 323, 253, 8946, 19627, 295, 2224, 310, 5816, 690, 2720, 789, 2074, 281, 253, 25874, 11421, 5049, 1275, 84, 337, 285, 374, 12661, 247, 14717, 273, 271, 19627, 4315, 275, 295, 2224, 715, 247, 1885, 273, 19554, 12624, 342, 327, 19, 6194, 494, 3602, 275, 2264, 275, 247, 1039, 2074, 281, 849, 2805, 4315, 310, 45765, 275, 2805, 83, 14717, 337, 4648, 2419, 11375, 20553, 1223, 374, 4648, 1259, 561, 39501, 9093, 253, 1072, 3652, 2972, 347, 253, 391, 1768, 7394, 50276, 18, 277, 263, 706, 386, 86, 1162, 355, 32862, 3847, 83, 9866, 294, 19484, 3006, 18902, 11454, 6928, 323, 5222, 10192, 26368, 896, 44263, 318, 50276, 19, 278, 3964, 2514, 1162, 355, 5919, 19627, 30364, 4448, 318, 273, 18902, 11454, 6928, 970, 2419, 11375, 24233, 1223, 253, 5919, 4764, 1320, 12215, 272, 9373, 38931, 1319, 310, 417, 7094, 4460, 275, 8946, 13361, 4645, 247, 2014, 10336, 326, 2987, 327, 31025, 316, 285, 327, 295, 261, 82, 4095, 476, 9580, 1534, 1600, 432, 13361, 285, 2805, 1686, 7888, 5474, 339, 431, 248, 4477, 12661, 247, 747, 1511, 273, 11454, 2990, 3828, 1925, 247, 25874, 11421, 5049, 436, 3828, 17930, 271, 19627, 4315, 25219, 285, 697, 7558, 326, 352, 4483, 323, 11786, 18499, 281, 320, 1408, 1223, 11850, 3962, 9373, 38931, 1319, 285, 342, 253, 1072, 20185, 3515, 
Below is a given review of a research paper from a conference journal. Please write a summary of the review.

### Review:
this paper argues that regularization-based approaches to continual learning fail to distinguish between classes from different tasks in the class-incremental setting without relying on task labels to simplify the problem the paper uses a theoretical argument showing it is not possible to assess the discriminability between two classes from different tasks without access to the data or a model of the data distribution and also demonstrates a difference in performance when conditioning on task labels

i think the thesis of the paper is plausible but unfortunately i dont think this is demonstrated in the submission in its current form and think there are a few flaws in the argument that need to be addressed i do think this is an interesting and important line of inquiry and could lead to valuable insights eg proving comprehensively that regularization-based approaches need to do a better job of capturing the data distribution or showing they can be improved by doing so i encourage the authors to address the comments below in lieu of this

main comments

1 the argument relies heavily on the assumption that regularization-based approaches to supervised continual learning do not model the data distribution eg however by hypothesis the regularization-based model does not model the data distribution first this seems to conflate regularization-based methods and discriminative models as regularization-based methods can be applied to generative modelling tasks eg vcl [1] with a classifier used on learned representations this scenario is not considered here second there is work showing that discriminative models have some generative capabilities eg [2] and it may be possible to extract this information to better model the data distribution and understand where previously learned decision boundaries are valid

2 the theoretical reasoning essentially demonstrates the simple intuition that we cannot assess whether the samples from two classes are linearly separable without having access to both sets of data or a model of the data from the class from the previous task this insight makes sense but its not clear that it is nontrivial or important it says nothing about how well and under what conditions previously learned decision boundaries may generalise/transfer to a new task and ignores my point 1 above that discriminative models do carry information about the data distribution

3 the empirical results primarily show that performance with some regularization methods is much worse when not using the task labels in different situations eg with disjoint/joint classes in each task this is known from previous work since regularization-based methods get much poorer performance in class-incremental learning ie single head no task label compared to task-incremental learning multi-head task label given see for example [3, 4] (a minimal illustrative sketch of the two evaluation protocols is included after the references below) unfortunately these results alone cannot attribute the poor performance to the reasons argued in the paper ie the inability to capture previous data distributions leading to the inability to discriminate classes across tasks and it would be nice to see experiments that actually measure whether the poor performance is due to the arguments made the paper also makes a broader claim about regularization-based methods being limited but only evaluates ewc and kfac other regularization-based approaches like vcl without coreset [1] and bgd [5] could be explored too

4 the writing is quite difficult to follow at times id recommend another thorough proofread to address grammar spelling and clarity issues see a list of some examples below but there are many others

5 there are a number of papers in continual learning that have been missed both older regularization-based approaches and some newer ones see refs below

some writing issues
- inferences should be inference in most places in the text eg for inference during inference etc
- most citations seem to not have parentheses so they blend into the text eg for continual learning kirkpatrick et al 2017 on page 1 please use citep instead of citet when the citation doesnt flow from the text
- section 3.2 the regularization term omega act should be acts
- also section 3.2 a matrix pondering weights importance not sure what pondering should mean here
- definition 1 separating two dataset should be datasets
- i think disentangled should be discriminable in the third part of section 4.2

references
[1] nguyen cuong v et al variational continual learning arxiv preprint arxiv:1710.10628 2017
[2] grathwohl will et al your classifier is secretly an energy based model and you should treat it like one arxiv preprint arxiv:1912.03263 2019
[3] hsu yenchang et al reevaluating continual learning scenarios a categorization and case for strong baselines arxiv preprint arxiv:1810.12488 2018
[4] van de ven gido m and andreas s tolias three scenarios for continual learning arxiv preprint arxiv:1904.07734 2019
[5] zeno chen et al task agnostic continual learning using online variational bayes arxiv preprint arxiv:1803.10123 2018
[6] aljundi rahaf et al memory aware synapses learning what not to forget proceedings of the european conference on computer vision eccv 2018
[7] aljundi r lin m goujaud b bengio y 2019 gradient based sample selection for online continual learning in advances in neural information processing systems pp 11816-11825
[8] rao d visin f rusu a pascanu r teh y w hadsell r 2019 continual unsupervised representation learning in advances in neural information processing systems pp 7647-7657
[9] chaudhry arslan et al using hindsight to anchor past knowledge in continual learning arxiv preprint arxiv:2002.08165 2020
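a minimal sketch of the class-incremental versus task-incremental evaluation protocols referred to in comment 3 above. it assumes a single output head over all classes seen so far and a fixed number of classes per task; the function name, the tensor shapes and the fixed classes_per_task are illustrative assumptions and not taken from the reviewed paper.

```python
import torch

def evaluate_logits(logits, task_id=None, classes_per_task=2):
    """Toy illustration of the two evaluation protocols.

    - task-incremental: the task label is given at test time, so the
      prediction is restricted to the classes belonging to that task.
    - class-incremental: no task label, so the prediction is taken
      over all classes seen so far.
    """
    if task_id is None:
        # class-incremental: argmax over every output unit
        return logits.argmax(dim=-1)
    # task-incremental: mask out the units of all other tasks
    start = task_id * classes_per_task
    end = start + classes_per_task
    masked = torch.full_like(logits, float("-inf"))
    masked[:, start:end] = logits[:, start:end]
    return masked.argmax(dim=-1)
```

under this sketch the same trained network can look much stronger in the task-incremental protocol simply because the masking removes all cross-task confusions, which is the gap the review is pointing at.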
docsep
this paper presents a theoretical analysis of regularization-based approaches to the problem of continually learning a sequence of tasks the point of the paper is to demonstrate shortcomings of these kinds of approaches in the context of class-incremental learning where classes are observed once and one after another the authors argue that these kinds of methods require task labels at test time to correctly distinguish classes from different tasks

i would like to mention as strengths of the paper
- the paper is a valuable attempt at examining properties of existing approaches to the problem of continual learning i appreciate the rigorous approach to the evaluation of the shortcomings of these kinds of methods
- the paper is in general well written and easy to follow there are no major errors or typos

the weaknesses i see in this paper are
- although there is a clear and formal explanation of why it is not possible to discriminate among classes from different tasks when there is no access to data from those previous classes i am not fully convinced that the set of parameters kept from previous classes and used in regularization-based approaches does not represent this data to some extent in particular there is no clear argument for the claim on page 5 however by hypothesis omega_{t-1} does not model the data distribution from c_{t-1} and therefore it does not model the data distribution from c_{t-1} classes i would like to see some discussion regarding how fairly a set of parameters theta_{t-1} would represent the s set
- in terms of the experiments i consider the number of tasks quite limited to be convinced i would like to see several tasks at least 10 and sequential results in terms of tasks learned rather than epochs

questions for authors
please address my comments on the weaknesses above

docsep
the paper presents an interesting conclusion that kinds of regularizations cannot learn to discriminate classes from different learning tasks both theoretical and experimental analysis is provided two regularization methods are tested

pro
1 the conclusion is interesting and it reveals that regularizations fail when the task label isnt given this conclusion questions those proposed regularizations for continual learning
2 the theoretical analysis of the shortcomings of regularizations in continual learning is presented besides the paper also pointed out supplementary shortcomings of regularization in other types of learning situations

cons
1 the related work is not sufficient it lacks the introduction and discussion of popular regularization methods in continual learning
2 the experiments are implemented on 3 tiny datasets mnist fashionmnist kmnist its not sufficient to show the importance of the proposed conclusion the popular cifar100 dataset should be tested
3 only two regularization methods are tested the popular knowledge distillation based regularization [a] is not tested (a sketch of this kind of penalty is included after this review)
4 although regularizations cannot learn to discriminate classes from different tasks they can avoid the bias to new training data which is also important for improving cl performance the paper would be stronger if the analysis why regularizations can improve cl performance is presented
5 can you explain why there is a sharp promotion/decrease of accuracy at the final steps in each learning task figures 2-3

[a] learning without forgetting li et al 2018
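the knowledge-distillation style regularization mentioned in point 3 of the cons above (learning without forgetting) amounts to a penalty that keeps the new model's outputs close to the outputs of the model trained on the previous tasks. the snippet below is a minimal illustrative sketch, not the implementation of the cited paper; the temperature value and the reduction are assumptions.

```python
import torch.nn.functional as F

def distillation_penalty(new_logits, old_logits, temperature=2.0):
    """Sketch of a learning-without-forgetting style penalty.

    The current model's (softened) predictions are pulled towards the
    predictions the previous model gives on the same inputs, so old
    decision behaviour is preserved without storing old data.
    """
    old_probs = F.softmax(old_logits / temperature, dim=-1)
    new_log_probs = F.log_softmax(new_logits / temperature, dim=-1)
    return F.kl_div(new_log_probs, old_probs, reduction="batchmean") * temperature ** 2
```

this penalty would be added to the new task's loss with some weighting factor, which is the kind of baseline the review asks to see tested alongside the two methods in the paper.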
docsep
tldr the paper presents an intuition as to why continual learning with regularization will be unable to discern subsequent tasks while i believe the intuitions to be correct the formal language used to present this intuition is not sufficiently rigorous to warrant publication

the paper sets out to build on the surprisingly sparse literature on the theory of continual learning the key point the paper wishes to make is that regularization-based continual learning approaches are unable to discern tasks without additional information such as task labels at training time this is attempted in the form of three propositions

first off i should say that i believe the authors are on the right track and have the correct intuitions i believe that the claim they want to make is correct and i believe that classification is a promising setting to prove their claim via separating hyperplanes in other words i want to like the paper but its current state does not allow me to recommend its publication let me also be transparent i think this paper needs more than a band-aid and will in my opinion not be publishable without a major revision / revise-and-resubmit order of magnitude in treatment to use journal jargon in essence my problem is that i do not think the current version of the paper manages to deliver a convincing rigorously mathematical proof of the developed intuitions since this is what the paper sets out to do i thus recommend that its current form not be published

the main hindrance throughout is a serious lack of precision this begins with the definitions for instance definition 2 defines interferences as follows

in machine learning interferences are conflicts between two or more objective functions leading to prediction errors there are interferences when optimizing one objective function degrades the optimization of at least another one

while this definition captures the intuition of interference it is not precise enough eg which objective functions what kind of prediction error to be used within a mathematical result as is done in proposition 4.3 which states that

while learning a sequence of disjoint classification tasks if the memory omega of the past tasks is only dependent on trained weights and learning criterion of previous task and does not model the past distribution it is not possible for deep neural networks to learn new tasks without interference

for example it is unclear which prediction error which is part of the interference definition from def 2 or which objective functions again they are part of definition 2 is/are the relevant one/ones for proposition 4.3 while both answers can be intuited to a certain degree if the reader is eager enough a rigorous result requires the reader to exactly know all relevant objects in question what it means to model the past distribution is also not precisely defined but it is used to state this result

this kind of imprecision plagues the paper throughout eg while lemma 4.1 refers to s, s as bounded sets of discrete points in r^n which is very clear lemma 4.2 suddenly calls the same tuple two bounded datasets without previously defining datasets as bounded sets of discrete points in r^n while this particular imprecision can be intuited this sort of imprecision does not help me trusting the result

similar problems plague the proof of proposition 4.3 my criticism is not that it is false instead my criticism is that it is written in such vague terms that it becomes impossible to assess if it is correct or incorrect in fact i believe that most of the intuitions i can follow are correct but this is the problem the proofs are written in an imprecise way forcing me to believe rather than convince myself as with the definitions and wording throughout the paper the statements in the proof lack rigour and are much more informal than what would be required for a watertight proof

as i understand it the main thrust of the proof is the following idea it is possible to rewrite the binary classification problem as the problem of finding a hyperplane between classes one can further rewrite the continual learning problem as the attempt of finding a hyperplane between a new class and previous classes by lemmas 4.1-4.2 this tells us that the problem is not solvable unless we store everything we observed previously in the regulariser i actually believe that this idea is very elegant simple and could be shown rigorously but i dont believe the current paper does that i think comparing the current papers approach with the related literature [1-3] will benefit the authors and allow them to clearly define the relevant objects of interest and rigorously prove their interesting result

regarding the lemmas i will also note the following while both results are clearly correct in the sense that without the second set of points the separating hyperplane problem is not even defined the purpose of lemmas 4.1-4.2 is doubtful it is fairly clear that you will not be able to find the separating hyperplane between two collections of points if you are given only one of these collections accordingly the proof of these lemmas feels like an attempt at proving that you wont know what real number xy is equal to if you are only given x = 3 but y remains unknown what i am trying to say is that the very problem of finding a separating hyperplane or of finding the real number xy is not defined if one of the two collections of points or y is not given
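the point about the lemmas above can be made concrete with a toy example: a separating hyperplane is only defined once both collections of points are available. the snippet below is an illustrative sketch, not code from the reviewed paper; the gaussian clusters and the choice of scikit-learn's logistic regression are assumptions, and the exact error message depends on the library version.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
old_class = rng.normal(loc=-2.0, size=(50, 2))  # points from a class of a previous task
new_class = rng.normal(loc=+2.0, size=(50, 2))  # points from a class of the new task

# with both collections the separating hyperplane is well defined
both_x = np.vstack([old_class, new_class])
both_y = np.array([0] * 50 + [1] * 50)
clf = LogisticRegression().fit(both_x, both_y)
print(clf.coef_, clf.intercept_)  # parameters of the learned hyperplane

# with only the new collection the problem is not even defined:
# a linear classifier cannot be fit on a single class
try:
    LogisticRegression().fit(new_class, np.ones(50))
except ValueError as err:
    print("cannot fit on one class:", err)
```

this is exactly the situation of a regulariser that keeps no information about the old classes: the second collection of points is simply missing, so the hyperplane between old and new classes is underdetermined.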
though the following is less important than my main points i also would have enjoyed a more thorough review of the very sparse related literature since there are very few papers discussing the theory of continual learning this part should be an easy fix to the best of my knowledge there are essentially 2 maybe 3 relevant papers

1 https://arxiv.org/abs/2006.11942 unpublished deriving generalization bounds for continual learning
2 https://arxiv.org/abs/2006.05188 icml 2020 probably most related to the current paper as they basically show a different impossibility result than proposition 4.3 which i think holds more generally
3 https://arxiv.org/abs/1610.08628 jmlr 2017 probably least related but still worth discussing derives regret bounds

### Summary:
all four reviewers recommend rejecting the paper however there is agreement that this is an interesting line of research and the ac agrees reviewers provided extensive and well-educated feedback the authors did not respond to the raised concerns
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper aims to learn generalizable representations without labels to this end this paper proposes maximum entropy coding mec inspired by the principle of maximum entropy in information theory mec uses the minimal lossy coding and the taylor series approximation to make the maximum entropy estimation feasible extensive experimental results show that mec consistently outperforms existing ssl methods under various downstream tasks furthermore this paper demonstrates that mec is robust to various hyperparameters eg smaller batch sizes and architectures eg vits the proposed method has several strengths 1 simplicity and scalability 2 high performance across various downstream tasks and 3 robustness to various hyperparameters and architectures i think the extensive experiments well demonstrate these strengths furthermore mec can be interpreted as batchwise and featurewise ssl methods so i feel this part is also interesting one major concern with this paper is that what information the proposed method can learn while existing ssl methods cannot is unclear this paper mentioned that existing ssl methods introduce some biases while mec does not what are the biases exactly could you provide some empirical evidence supporting that mec can learn lessbiased representations i think this paper does not explain why and how mec can outperform other ssl methods some minor concerns are provided in the questions section to sum up although some explanation seems insufficient i feel the empirical results are strong hence i vote for weak accept this paper well addressed the limitations and the potential negative societal impact docsepthis paper proposed a jointembedding objective called maximum entropy coding which is close to mcr2 74 this method directly optimizes the information content by minimizing the coding length function this proposed method unifies batchwise objectives simsiam and featurewise objectives barlowtwins practically this is implemented as a taylor series expansion the proposed method shows strong performances when combined with all existing techniques including exponential moving average and asymmetric network on a wide range of experiments including imagenet linear probe semisupervised classification transfer learning on video tasks and object detection however this paper actually uses all the tricks including exponential moving average from byol but only mentions it in the ablation studysupplementary material it is obvious that the strong result of this method mostly comes from this mixture of existing design components strengths 1 the proposed method is strongly supported by theoretical motivation the maximum entropy principle 2 the proposed method shows strong performances on a wide range of experiments when combined with all existing techniques including exponential moving average asymmetric network including imagenet linear probe semisupervised classification transfer learning on video tasks and object detection 3 the authors provided experimental details including pseudocode and all hyperparameters weaknesses 1 the strong result mostly comes from a mixture of designing components eg exponential moving average and asymmetric architecture it is questionable how much advantage comes from the maximum entropy coding loss in fact the only contribution of the proposed method over existing methods is that it has a higherorder correction table 5e clearly shows that the extra two 
orders 24 only increase accuracy by 03 2 its unclear what the main contribution is to this paper a the principle of maximum entropy is no different from requiring independent features which is already proposed in mcr2 barlowtwins vicreg and whitening ssl b the higherorder approximation as mentioned above only has a 03 linear probe improvement all other experiments fail to verify the advantage c the practical advantage of this mixture model the authors only compare their work to fundamental frameworks byol barlowtwins swav but there are many similar mixture models eg arxiv220407141 arxiv210414548 arxiv210912909 arxiv201213493 arxiv220105119 after rebuttal comments i am more convinced by the authors that the empirical advantage is misleading the empirical advantage is exactly just 03 actually in response point 1 barlow twins are supposed to reach 735 so here the extra two orders approximately indeed only provide a 01 improvement i think this is totally ok a good theory unifies several ideas and provides incremental improvements but it is unacceptable that the paper tries to hide this point the main figure and most of the results tables will mislead readers that the advantage of the proposed method over barlow twins simclr is due to the proposed loss in fact leveraging other tricks like momentum encoders is the main reason but is only mentioned in l159 once not in the figure not in the main text not in the experimental detail na docsepthis paper proposes a selfsupervised learning method dubbed maximum entropy encoding mec which leverages the principle of maximum entropy to learn unbiased representations of an image dataset experiments done on imagenet the authors combine a maximum entropy with the augmentationinvariance objective of contrastive learning which is justified as a view consistency prior which gives testable information the exact resulting loss has a log determinant which they approximate using a taylor series they show that various orders of this approximation are in fact equivalent to existing selfsupervised methods such as simsiam and barlow twins experiments are conducted with pretraining on imagenet and suitably diverse downstream tasks linear evaluation on imagenet semisupervised classification on imagenet transfer learning on object detection voc coco and segmentation coco as well as video tracking table 5 strengths originality the community has been iteratively constructing contrastive or similar selfsupervised learning methods for several years now this paper subsumes several previous works by providing a novel unified view that allows for further adaption and exploration while the resulting loss code is not markedly different from existing work the theoretical approach it is derived from and the resulting flexibilityopportunity it yields constitute originality in my eyes quality the experiments in this paper are very clean with straightforward wellaccepted experimental settings appendix c describes the methodology well without introducing any complicating bells and whistles the results of the new method are consistently equal or superior to existing objectives all while being very motivated i crossreferenced numbers with original works and the authors match or in some cases exceed the original citations in table referenced numbers example the arxiv version of simclr lists 693 as its top1 imagenet acc while this work gives up to 704 i assume these discrepancies are from improvedadditional facets in the training but would appreciate clarification on this point i feel 
extremely comfortable that i could replicate the experiments of this paper with minimal effort and would see comparable results clarity this paper is extremely wellwritten i was able to understand all of the concepts on the first readthrough the order of presentation was logical and descriptions were enlightening without being overly wordy as previously mentioned i feel comfortable that i could replicate the results quickly especially given the useful pseudocode in appendix a significance the direct tying of a family of objectives to a very grounded mathematical concept is highly significant in my eyes as it brings the iterative line of research to a convergence point from which more novel lines of research can be taken there could have been several more iterations of selfsupervised learning papers converging to this objective by empirical motivation but this work shortcuts that and opens up exciting possibilities for future work especially given the prominence of contrastive learning objectives in current visionlanguage works misc figure 4 was an extremely useful illustration appendix was extremely informative and helpful the ablations in table 5 particularly e largely match my intuition which is nice weaknesses originality this paper consolidates existing lines of exploration and opens the door for further contrastive methods and as such the ultimate objective is not technically very different from existing work given the importance and significance of the consolidation though this is not truly a weakness in my opinion quality i do wish that there were at least some pretraining experiments on datasets besides imagenet either uncurated web imagery or something like places365 in my opinion this is the biggest weakness of this paper as stated above overall i was very impressed with the quality of this work the only typo that jumped out to me while reading was variaty instead of variety in l60 clarity i am very familiar with the contrastive learning so am probably a biased estimator here but i thought the explanations and experiments were extremely straightforward and nonconfusing i have no complaints about the clarity my only related suggestion is that a pointer to appendix h is placed in the main text see misc below significance as stated in the above originality section its difficult for a consolidation paper to have earthshattering impact as the value largely lies in the tying together of work that necessarily exists i dont think that this by any means is a net negative i would rank this paper as one of the more significant ones that ive come across in the last 24 weeks of arxiv but it does cap it below what a conference best paper might look like misc appendix h starting from l724 is extremely interesting and i was wishing for a similar section within the main text while incorporating that many lines is impractical at this stage of the paper i would suggest that a reference to appendix h with a preview of what it contains is added to the main text appendices e and f provide a succinct but accurate and precise description of limitations and the possible relation to societal impact ### Summary:
the paper in general received three positive reviews and ratings the three reviewers all recognize the theoretical soundness of the paper and the paper is also clearly presented with informative and strong experimental results there are a few places making one reviewer less comfortable in terms of the exact effectiveness of the proposed theory while overall the experimental results are comprehensive and basically support the claims made by the paper the authors may further clarify these points based on the comments
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 13698, 281, 3037, 2087, 12729, 14237, 1293, 13301, 281, 436, 990, 436, 2929, 29328, 4869, 15579, 12425, 479, 68, 11797, 407, 253, 8063, 273, 4869, 15579, 275, 1491, 3762, 479, 68, 4648, 253, 8723, 2957, 90, 12425, 285, 253, 246, 9614, 2962, 11193, 281, 1056, 253, 4869, 15579, 13418, 17887, 9470, 5661, 1543, 921, 326, 479, 68, 12724, 41731, 13015, 5368, 256, 3433, 3082, 762, 2710, 15450, 8892, 33810, 436, 2929, 14371, 326, 479, 68, 310, 10237, 281, 2710, 4373, 22041, 24088, 4577, 14604, 9552, 285, 35615, 24088, 362, 953, 50276, 783, 4081, 1332, 556, 2067, 20544, 337, 17647, 285, 9171, 1430, 374, 1029, 3045, 2439, 2710, 15450, 8892, 285, 495, 31640, 281, 2710, 4373, 22041, 285, 35615, 891, 1158, 253, 9470, 4679, 973, 7568, 841, 20544, 33810, 479, 68, 476, 320, 12814, 347, 14604, 3020, 285, 4735, 3020, 256, 3433, 3082, 594, 891, 1928, 436, 629, 310, 671, 4722, 50276, 531, 2201, 4468, 342, 436, 2929, 310, 326, 752, 1491, 253, 4081, 1332, 476, 3037, 1223, 5368, 256, 3433, 3082, 2550, 310, 12744, 436, 2929, 5393, 326, 5368, 256, 3433, 3082, 9569, 690, 31306, 1223, 479, 68, 1057, 417, 752, 403, 253, 31306, 4555, 812, 368, 2085, 690, 16774, 1941, 8109, 326, 479, 68, 476, 3037, 1679, 30344, 14237, 891, 1158, 436, 2929, 1057, 417, 5513, 2139, 285, 849, 479, 68, 476, 562, 32231, 643, 256, 3433, 3082, 50276, 8826, 5884, 7350, 403, 2530, 275, 253, 3533, 2593, 50276, 936, 2020, 598, 3738, 690, 8813, 3133, 12497, 891, 1928, 253, 16774, 1543, 403, 2266, 7613, 891, 6273, 323, 5075, 2997, 50276, 2520, 2929, 973, 9713, 253, 7364, 285, 253, 2442, 4016, 38058, 3486, 50276, 7152, 33032, 2520, 2929, 4081, 247, 6036, 24224, 5361, 8103, 1925, 4869, 15579, 12425, 534, 310, 2810, 281, 278, 7083, 19, 10677, 436, 1332, 3587, 5556, 4219, 253, 1491, 2600, 407, 28699, 253, 12425, 2978, 1159, 436, 4081, 1332, 440, 7790, 14604, 3020, 16566, 948, 9245, 312, 285, 4735, 3020, 16566, 2534, 676, 7553, 968, 18236, 436, 310, 9009, 347, 247, 246, 9614, 2962, 7466, 50276, 783, 4081, 1332, 2722, 2266, 16226, 672, 5678, 342, 512, 5368, 5609, 1690, 17619, 4886, 3388, 285, 26640, 2990, 327, 247, 4618, 2491, 273, 4679, 1690, 4440, 257, 292, 4872, 10304, 49863, 29974, 13337, 9162, 3700, 4715, 327, 3492, 8892, 285, 1789, 5481, 50276, 35529, 436, 2929, 2686, 4648, 512, 253, 24866, 1690, 17619, 4886, 3388, 432, 407, 311, 533, 760, 25957, 352, 275, 253, 28913, 799, 656, 1135, 3695, 2144, 352, 310, 4755, 326, 253, 2266, 906, 273, 436, 1332, 6571, 3249, 432, 436, 7802, 273, 5368, 2216, 4295, 20544, 337, 253, 4081, 1332, 310, 7052, 4516, 407, 10527, 16038, 253, 4869, 15579, 8063, 50276, 19, 253, 4081, 1332, 2722, 2266, 16226, 327, 247, 4618, 2491, 273, 4679, 672, 5678, 342, 512, 5368, 5609, 1690, 17619, 4886, 3388, 26640, 2990, 1690, 4440, 257, 292, 4872, 10304, 49863, 29974, 13337, 9162, 3700, 4715, 327, 3492, 8892, 285, 1789, 5481, 495, 253, 4477, 2530, 5661, 4278, 1690, 10585, 406, 853, 285, 512, 4373, 22041, 50275, 20881, 1255, 265, 337, 253, 2266, 906, 6571, 3249, 432, 247, 7802, 273, 20462, 4295, 24088, 17619, 4886, 3388, 285, 26640, 10336, 352, 310, 30455, 849, 1199, 5750, 3249, 432, 253, 4869, 15579, 12425, 2957, 275, 958, 253, 760, 7680, 273, 253, 4081, 1332, 689, 5368, 3082, 310, 326, 352, 556, 247, 2169, 2621, 10618, 2829, 608, 70, 4518, 2722, 326, 253, 4465, 767, 7367, 2164, 760, 2572, 7200, 407, 17272, 374, 697, 12744, 752, 253, 2022, 7680, 310, 281, 436, 2929, 50276, 66, 253, 
8063, 273, 4869, 15579, 310, 642, 1027, 432, 10568, 3907, 3386, 534, 310, 2168, 4081, 275, 278, 7083, 19, 2534, 676, 7553, 968, 15951, 1747, 285, 30661, 2980, 256, 3433, 50276, 67, 253, 2169, 2621, 11193, 347, 5393, 1840, 760, 556, 247, 17272, 4872, 10304, 7756, 512, 643, 4679, 1891, 281, 12654, 253, 5750, 50276, 68, 253, 8542, 5750, 273, 436, 7802, 1566, 253, 4477, 760, 7277, 616, 789, 281, 7936, 31225, 407, 311, 2534, 676, 7553, 968, 1863, 580, 533, 627, 403, 1142, 2074, 7802, 3210, 24088, 549, 32693, 14256, 24769, 18962, 549, 32693, 19, 11238, 11838, 2385, 549, 32693, 19, 12852, 805, 28766, 549, 32693, 6755, 1012, 35337, 549, 32693, 19, 1252, 1762, 12115, 50275, 6438, 30080, 22559, 5701, 50275, 74, 717, 625, 13762, 407, 253, 4477, 326, 253, 16774, 5750, 310, 24363, 253, 16774, 5750, 310, 4555, 816, 17272, 2686, 275, 2380, 1127, 337, 2534, 676, 26664, 403, 6326, 281, 3986, 818, 1671, 594, 1060, 253, 4465, 767, 7367, 5512, 6296, 760, 2085, 247, 14805, 7756, 50276, 74, 1158, 436, 310, 9106, 8718, 247, 1175, 3762, 440, 7790, 2067, 5697, 285, 3400, 32809, 11701, 533, 352, 310, 28536, 326, 253, 2929, 14177, 281, 10507, 436, 1127, 50276, 783, 2022, 4677, 285, 954, 273, 253, 1543, 7180, 588, 3731, 26460, 10668, 326, 253, 5750, 273, 253, 4081, 1332, 689, 2534, 676, 26664, 50276, 3549, 498, 83, 310, 1955, 281, 253, 4081, 2957, 275, 958, 19732, 2977, 643, 24866, 751, 10254, 2349, 351, 398, 310, 253, 2022, 1921, 533, 310, 760, 5393, 275, 298, 17220, 2378, 417, 275, 253, 4677, 417, 275, 253, 2022, 2505, 417, 275, 253, 5661, 2508, 50276, 2072, 5474, 33032, 2520, 2929, 29328, 247, 1881, 35421, 4715, 1332, 33690, 4869, 15579, 9706, 479, 68, 534, 19732, 1131, 253, 8063, 273, 4869, 15579, 281, 3037, 38663, 14237, 273, 271, 2460, 10895, 4679, 2218, 327, 4440, 257, 292, 253, 4477, 13398, 247, 4869, 15579, 342, 253, 42072, 7821, 14417, 8103, 273, 4499, 422, 4715, 534, 310, 17285, 347, 247, 1859, 15274, 2720, 534, 4245, 1071, 494, 1491, 50276, 783, 3242, 4795, 2957, 556, 247, 2412, 27152, 534, 597, 16851, 970, 247, 246, 9614, 2962, 597, 921, 326, 2710, 7367, 273, 436, 11193, 403, 275, 958, 6425, 281, 5368, 1881, 35421, 3082, 824, 347, 948, 9245, 312, 285, 2534, 676, 26664, 4679, 403, 5196, 342, 3215, 26208, 327, 4440, 257, 292, 285, 43364, 11117, 15450, 8892, 4872, 7103, 327, 4440, 257, 292, 49863, 29974, 13337, 9162, 327, 4440, 257, 292, 3700, 4715, 327, 1789, 5481, 11571, 50276, 68, 16856, 285, 26405, 9285, 80, 347, 973, 347, 3492, 12544, 2829, 608, 20544, 3236, 414, 253, 3114, 556, 644, 10040, 3146, 26736, 4499, 422, 390, 2074, 1881, 35421, 4715, 3082, 323, 2067, 1107, 1024, 436, 2929, 749, 2204, 265, 2067, 2045, 2987, 407, 5277, 247, 4460, 27998, 1859, 326, 4483, 323, 2007, 5223, 279, 285, 17947, 1223, 253, 4795, 2957, 2127, 310, 417, 22293, 1027, 432, 5368, 789, 253, 10527, 2746, 352, 310, 6012, 432, 285, 253, 4795, 15840, 10468, 32268, 352, 11026, 12647, 3236, 414, 275, 619, 2927, 50276, 15177, 253, 4679, 275, 436, 2929, 403, 1077, 4076, 342, 15246, 973, 14764, 264, 5661, 7533, 30762, 260, 8631, 253, 16182, 973, 1293, 16984, 667, 5177, 839, 36161, 285, 37031, 868, 253, 1543, 273, 253, 747, 1332, 403, 12724, 4503, 390, 8936, 281, 5368, 16566, 512, 1223, 1146, 1077, 17194, 891, 2831, 250, 50252, 3904, 342, 3236, 2987, 285, 253, 4477, 3761, 390, 275, 690, 2219, 8268, 253, 3236, 30404, 275, 2829, 23378, 3904, 1650, 253, 549, 32693, 2715, 273, 948, 498, 83, 10894, 721, 4590, 347, 697, 1755, 18, 4440, 257, 292, 756, 1223, 436, 789, 4245, 598, 281, 48798, 891, 5467, 841, 37122, 403, 432, 5520, 38092, 42083, 
275, 253, 3733, 533, 651, 11435, 37699, 327, 436, 1127, 891, 1928, 6685, 9848, 326, 891, 812, 25464, 253, 4679, 273, 436, 2929, 342, 8723, 3434, 285, 651, 923, 10870, 1543, 50276, 498, 15752, 436, 2929, 310, 6685, 973, 15720, 891, 369, 2104, 281, 2096, 512, 273, 253, 12342, 327, 253, 806, 1239, 10489, 253, 1340, 273, 9759, 369, 13760, 285, 20121, 497, 25441, 2980, 1293, 1146, 27662, 3159, 90, 347, 3786, 5393, 891, 1928, 9848, 326, 891, 812, 25464, 253, 1543, 4541, 3340, 1677, 253, 4217, 10585, 406, 853, 275, 30762, 247, 50276, 9188, 40348, 253, 1480, 42068, 273, 247, 2021, 273, 16566, 281, 247, 1077, 28462, 15965, 4473, 310, 4122, 1534, 275, 619, 2927, 347, 352, 10316, 253, 34560, 1386, 273, 2561, 281, 247, 14940, 1127, 432, 534, 625, 4460, 3104, 273, 2561, 476, 320, 2668, 627, 812, 452, 644, 2067, 625, 25142, 273, 1881, 35421, 4715, 9380, 5975, 3390, 281, 436, 8103, 407, 16774, 16038, 533, 436, 789, 28194, 84, 326, 285, 13279, 598, 12302, 15018, 323, 2852, 789, 3340, 1677, 253, 44373, 273, 4499, 422, 4715, 16566, 275, 1655, 8113, 12982, 2987, 50276, 43671, 50276, 13206, 577, 369, 271, 6685, 4217, 23356, 50276, 50237, 369, 6685, 27096, 285, 9371, 50276, 783, 490, 77, 569, 275, 2829, 608, 3782, 299, 8127, 3761, 619, 30328, 534, 310, 5322, 50276, 20881, 1255, 265, 3236, 414, 436, 2929, 16932, 684, 5368, 3104, 273, 17947, 285, 13279, 253, 3369, 323, 2007, 4499, 422, 3082, 285, 347, 824, 253, 12553, 8103, 310, 417, 22335, 1077, 1027, 432, 5368, 789, 1677, 253, 6349, 285, 8453, 273, 253, 34889, 2167, 436, 310, 417, 7777, 247, 14855, 275, 619, 4743, 50276, 15177, 891, 513, 5730, 326, 627, 497, 387, 1878, 690, 3215, 26208, 4679, 327, 15302, 16280, 4440, 257, 292, 2057, 440, 1915, 456, 4384, 27471, 390, 1633, 751, 5053, 22359, 275, 619, 4743, 436, 310, 253, 5962, 14855, 273, 436, 2929, 347, 4767, 1840, 4583, 891, 369, 1077, 17847, 342, 253, 3290, 273, 436, 789, 253, 760, 1745, 80, 326, 16780, 562, 281, 479, 1223, 4361, 369, 1459, 37968, 3185, 273, 5235, 275, 298, 1549, 50276, 498, 15752, 891, 717, 1077, 7615, 342, 253, 4499, 422, 4715, 594, 717, 3164, 247, 23539, 29107, 1060, 533, 891, 1869, 253, 22909, 285, 4679, 497, 6685, 15246, 285, 1327, 8259, 5302, 891, 452, 642, 14672, 670, 253, 19843, 619, 760, 2905, 14876, 310, 326, 247, 12219, 281, 30762, 288, 310, 4845, 275, 253, 2022, 2505, 923, 27722, 2708, 50276, 9188, 40348, 347, 4767, 275, 253, 1840, 3236, 414, 2593, 697, 2834, 323, 247, 34889, 2929, 281, 452, 6149, 84, 700, 22993, 3486, 347, 253, 1318, 8127, 8696, 275, 253, 42068, 2366, 273, 789, 326, 7933, 4961, 891, 13414, 1158, 326, 436, 407, 667, 2097, 310, 247, 2036, 4016, 891, 651, 5958, 436, 2929, 347, 581, 273, 253, 625, 1534, 4394, 326, 209, 422, 1705, 2439, 275, 253, 1390, 2164, 3618, 273, 549, 32693, 533, 352, 1057, 1729, 352, 2708, 752, 247, 8059, 1682, 2929, 1537, 1007, 751, 50275, 43671, 50276, 50237, 288, 4983, 432, 298, 24, 1348, 310, 6685, 4722, 285, 891, 369, 30685, 323, 247, 2074, 2593, 1561, 253, 2022, 2505, 1223, 24049, 326, 1142, 3104, 310, 45783, 387, 436, 3924, 273, 253, 2929, 891, 651, 1804, 326, 247, 3806, 281, 30762, 288, 342, 247, 25044, 273, 752, 352, 4428, 310, 2879, 281, 253, 2022, 2505, 50276, 9691, 1271, 299, 285, 269, 2085, 247, 18382, 4291, 533, 7899, 285, 10799, 5740, 273, 7364, 285, 253, 1896, 5886, 281, 38058, 3486, 2490, 187, 4118, 18435, 27, 783, 2929, 275, 2087, 2959, 1264, 2762, 8680, 84, 285, 17503, 253, 1264, 30628, 512, 9446, 253, 10527, 3590, 1255, 273, 253, 2929, 285, 253, 2929, 310, 671, 4518, 3559, 342, 27096, 285, 2266, 5661, 1543, 627, 310, 247, 
1643, 5053, 2403, 581, 37317, 1679, 9848, 275, 2426, 273, 253, 3242, 12510, 273, 253, 4081, 3762, 1223, 4583, 253, 5661, 1543, 403, 11088, 285, 10323, 915, 263, 253, 15081, 2792, 1160, 407, 253, 2929, 253, 4477, 778, 11829, 379, 19148, 253, 2792, 1754, 327, 253, 5701, 50275 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 13698, 281, 3037, 2087, 12729, 14237, 1293, 13301, 281, 436, 990, 436, 2929, 29328, 4869, 15579, 12425, 479, 68, 11797, 407, 253, 8063, 273, 4869, 15579, 275, 1491, 3762, 479, 68, 4648, 253, 8723, 2957, 90, 12425, 285, 253, 246, 9614, 2962, 11193, 281, 1056, 253, 4869, 15579, 13418, 17887, 9470, 5661, 1543, 921, 326, 479, 68, 12724, 41731, 13015, 5368, 256, 3433, 3082, 762, 2710, 15450, 8892, 33810, 436, 2929, 14371, 326, 479, 68, 310, 10237, 281, 2710, 4373, 22041, 24088, 4577, 14604, 9552, 285, 35615, 24088, 362, 953, 50276, 783, 4081, 1332, 556, 2067, 20544, 337, 17647, 285, 9171, 1430, 374, 1029, 3045, 2439, 2710, 15450, 8892, 285, 495, 31640, 281, 2710, 4373, 22041, 285, 35615, 891, 1158, 253, 9470, 4679, 973, 7568, 841, 20544, 33810, 479, 68, 476, 320, 12814, 347, 14604, 3020, 285, 4735, 3020, 256, 3433, 3082, 594, 891, 1928, 436, 629, 310, 671, 4722, 50276, 531, 2201, 4468, 342, 436, 2929, 310, 326, 752, 1491, 253, 4081, 1332, 476, 3037, 1223, 5368, 256, 3433, 3082, 2550, 310, 12744, 436, 2929, 5393, 326, 5368, 256, 3433, 3082, 9569, 690, 31306, 1223, 479, 68, 1057, 417, 752, 403, 253, 31306, 4555, 812, 368, 2085, 690, 16774, 1941, 8109, 326, 479, 68, 476, 3037, 1679, 30344, 14237, 891, 1158, 436, 2929, 1057, 417, 5513, 2139, 285, 849, 479, 68, 476, 562, 32231, 643, 256, 3433, 3082, 50276, 8826, 5884, 7350, 403, 2530, 275, 253, 3533, 2593, 50276, 936, 2020, 598, 3738, 690, 8813, 3133, 12497, 891, 1928, 253, 16774, 1543, 403, 2266, 7613, 891, 6273, 323, 5075, 2997, 50276, 2520, 2929, 973, 9713, 253, 7364, 285, 253, 2442, 4016, 38058, 3486, 50276, 7152, 33032, 2520, 2929, 4081, 247, 6036, 24224, 5361, 8103, 1925, 4869, 15579, 12425, 534, 310, 2810, 281, 278, 7083, 19, 10677, 436, 1332, 3587, 5556, 4219, 253, 1491, 2600, 407, 28699, 253, 12425, 2978, 1159, 436, 4081, 1332, 440, 7790, 14604, 3020, 16566, 948, 9245, 312, 285, 4735, 3020, 16566, 2534, 676, 7553, 968, 18236, 436, 310, 9009, 347, 247, 246, 9614, 2962, 7466, 50276, 783, 4081, 1332, 2722, 2266, 16226, 672, 5678, 342, 512, 5368, 5609, 1690, 17619, 4886, 3388, 285, 26640, 2990, 327, 247, 4618, 2491, 273, 4679, 1690, 4440, 257, 292, 4872, 10304, 49863, 29974, 13337, 9162, 3700, 4715, 327, 3492, 8892, 285, 1789, 5481, 50276, 35529, 436, 2929, 2686, 4648, 512, 253, 24866, 1690, 17619, 4886, 3388, 432, 407, 311, 533, 760, 25957, 352, 275, 253, 28913, 799, 656, 1135, 3695, 2144, 352, 310, 4755, 326, 253, 2266, 906, 273, 436, 1332, 6571, 3249, 432, 436, 7802, 273, 5368, 2216, 4295, 20544, 337, 253, 4081, 1332, 310, 7052, 4516, 407, 10527, 16038, 253, 4869, 15579, 8063, 50276, 19, 253, 4081, 1332, 2722, 2266, 16226, 327, 247, 4618, 2491, 273, 4679, 672, 5678, 342, 512, 5368, 5609, 1690, 17619, 4886, 3388, 26640, 2990, 1690, 4440, 257, 292, 4872, 10304, 49863, 29974, 13337, 9162, 3700, 4715, 327, 3492, 8892, 285, 1789, 5481, 495, 253, 4477, 2530, 5661, 4278, 1690, 10585, 406, 853, 285, 512, 4373, 22041, 50275, 20881, 1255, 265, 337, 253, 2266, 906, 6571, 3249, 432, 247, 7802, 273, 20462, 4295, 24088, 17619, 4886, 3388, 285, 26640, 10336, 352, 310, 30455, 849, 1199, 5750, 3249, 432, 253, 4869, 15579, 12425, 2957, 275, 958, 253, 760, 7680, 273, 253, 4081, 1332, 689, 5368, 3082, 310, 326, 352, 556, 247, 2169, 2621, 10618, 2829, 608, 70, 4518, 2722, 326, 253, 4465, 767, 7367, 2164, 760, 2572, 7200, 407, 17272, 374, 697, 12744, 752, 253, 2022, 7680, 310, 281, 436, 2929, 50276, 66, 253, 
8063, 273, 4869, 15579, 310, 642, 1027, 432, 10568, 3907, 3386, 534, 310, 2168, 4081, 275, 278, 7083, 19, 2534, 676, 7553, 968, 15951, 1747, 285, 30661, 2980, 256, 3433, 50276, 67, 253, 2169, 2621, 11193, 347, 5393, 1840, 760, 556, 247, 17272, 4872, 10304, 7756, 512, 643, 4679, 1891, 281, 12654, 253, 5750, 50276, 68, 253, 8542, 5750, 273, 436, 7802, 1566, 253, 4477, 760, 7277, 616, 789, 281, 7936, 31225, 407, 311, 2534, 676, 7553, 968, 1863, 580, 533, 627, 403, 1142, 2074, 7802, 3210, 24088, 549, 32693, 14256, 24769, 18962, 549, 32693, 19, 11238, 11838, 2385, 549, 32693, 19, 12852, 805, 28766, 549, 32693, 6755, 1012, 35337, 549, 32693, 19, 1252, 1762, 12115, 50275, 6438, 30080, 22559, 5701, 50275, 74, 717, 625, 13762, 407, 253, 4477, 326, 253, 16774, 5750, 310, 24363, 253, 16774, 5750, 310, 4555, 816, 17272, 2686, 275, 2380, 1127, 337, 2534, 676, 26664, 403, 6326, 281, 3986, 818, 1671, 594, 1060, 253, 4465, 767, 7367, 5512, 6296, 760, 2085, 247, 14805, 7756, 50276, 74, 1158, 436, 310, 9106, 8718, 247, 1175, 3762, 440, 7790, 2067, 5697, 285, 3400, 32809, 11701, 533, 352, 310, 28536, 326, 253, 2929, 14177, 281, 10507, 436, 1127, 50276, 783, 2022, 4677, 285, 954, 273, 253, 1543, 7180, 588, 3731, 26460, 10668, 326, 253, 5750, 273, 253, 4081, 1332, 689, 2534, 676, 26664, 50276, 3549, 498, 83, 310, 1955, 281, 253, 4081, 2957, 275, 958, 19732, 2977, 643, 24866, 751, 10254, 2349, 351, 398, 310, 253, 2022, 1921, 533, 310, 760, 5393, 275, 298, 17220, 2378, 417, 275, 253, 4677, 417, 275, 253, 2022, 2505, 417, 275, 253, 5661, 2508, 50276, 2072, 5474, 33032, 2520, 2929, 29328, 247, 1881, 35421, 4715, 1332, 33690, 4869, 15579, 9706, 479, 68, 534, 19732, 1131, 253, 8063, 273, 4869, 15579, 281, 3037, 38663, 14237, 273, 271, 2460, 10895, 4679, 2218, 327, 4440, 257, 292, 253, 4477, 13398, 247, 4869, 15579, 342, 253, 42072, 7821, 14417, 8103, 273, 4499, 422, 4715, 534, 310, 17285, 347, 247, 1859, 15274, 2720, 534, 4245, 1071, 494, 1491, 50276, 783, 3242, 4795, 2957, 556, 247, 2412, 27152, 534, 597, 16851, 970, 247, 246, 9614, 2962, 597, 921, 326, 2710, 7367, 273, 436, 11193, 403, 275, 958, 6425, 281, 5368, 1881, 35421, 3082, 824, 347, 948, 9245, 312, 285, 2534, 676, 26664, 4679, 403, 5196, 342, 3215, 26208, 327, 4440, 257, 292, 285, 43364, 11117, 15450, 8892, 4872, 7103, 327, 4440, 257, 292, 49863, 29974, 13337, 9162, 327, 4440, 257, 292, 3700, 4715, 327, 1789, 5481, 11571, 50276, 68, 16856, 285, 26405, 9285, 80, 347, 973, 347, 3492, 12544, 2829, 608, 20544, 3236, 414, 253, 3114, 556, 644, 10040, 3146, 26736, 4499, 422, 390, 2074, 1881, 35421, 4715, 3082, 323, 2067, 1107, 1024, 436, 2929, 749, 2204, 265, 2067, 2045, 2987, 407, 5277, 247, 4460, 27998, 1859, 326, 4483, 323, 2007, 5223, 279, 285, 17947, 1223, 253, 4795, 2957, 2127, 310, 417, 22293, 1027, 432, 5368, 789, 253, 10527, 2746, 352, 310, 6012, 432, 285, 253, 4795, 15840, 10468, 32268, 352, 11026, 12647, 3236, 414, 275, 619, 2927, 50276, 15177, 253, 4679, 275, 436, 2929, 403, 1077, 4076, 342, 15246, 973, 14764, 264, 5661, 7533, 30762, 260, 8631, 253, 16182, 973, 1293, 16984, 667, 5177, 839, 36161, 285, 37031, 868, 253, 1543, 273, 253, 747, 1332, 403, 12724, 4503, 390, 8936, 281, 5368, 16566, 512, 1223, 1146, 1077, 17194, 891, 2831, 250, 50252, 3904, 342, 3236, 2987, 285, 253, 4477, 3761, 390, 275, 690, 2219, 8268, 253, 3236, 30404, 275, 2829, 23378, 3904, 1650, 253, 549, 32693, 2715, 273, 948, 498, 83, 10894, 721, 4590, 347, 697, 1755, 18, 4440, 257, 292, 756, 1223, 436, 789, 4245, 598, 281, 48798, 891, 5467, 841, 37122, 403, 432, 5520, 38092, 42083, 
275, 253, 3733, 533, 651, 11435, 37699, 327, 436, 1127, 891, 1928, 6685, 9848, 326, 891, 812, 25464, 253, 4679, 273, 436, 2929, 342, 8723, 3434, 285, 651, 923, 10870, 1543, 50276, 498, 15752, 436, 2929, 310, 6685, 973, 15720, 891, 369, 2104, 281, 2096, 512, 273, 253, 12342, 327, 253, 806, 1239, 10489, 253, 1340, 273, 9759, 369, 13760, 285, 20121, 497, 25441, 2980, 1293, 1146, 27662, 3159, 90, 347, 3786, 5393, 891, 1928, 9848, 326, 891, 812, 25464, 253, 1543, 4541, 3340, 1677, 253, 4217, 10585, 406, 853, 275, 30762, 247, 50276, 9188, 40348, 253, 1480, 42068, 273, 247, 2021, 273, 16566, 281, 247, 1077, 28462, 15965, 4473, 310, 4122, 1534, 275, 619, 2927, 347, 352, 10316, 253, 34560, 1386, 273, 2561, 281, 247, 14940, 1127, 432, 534, 625, 4460, 3104, 273, 2561, 476, 320, 2668, 627, 812, 452, 644, 2067, 625, 25142, 273, 1881, 35421, 4715, 9380, 5975, 3390, 281, 436, 8103, 407, 16774, 16038, 533, 436, 789, 28194, 84, 326, 285, 13279, 598, 12302, 15018, 323, 2852, 789, 3340, 1677, 253, 44373, 273, 4499, 422, 4715, 16566, 275, 1655, 8113, 12982, 2987, 50276, 43671, 50276, 13206, 577, 369, 271, 6685, 4217, 23356, 50276, 50237, 369, 6685, 27096, 285, 9371, 50276, 783, 490, 77, 569, 275, 2829, 608, 3782, 299, 8127, 3761, 619, 30328, 534, 310, 5322, 50276, 20881, 1255, 265, 3236, 414, 436, 2929, 16932, 684, 5368, 3104, 273, 17947, 285, 13279, 253, 3369, 323, 2007, 4499, 422, 3082, 285, 347, 824, 253, 12553, 8103, 310, 417, 22335, 1077, 1027, 432, 5368, 789, 1677, 253, 6349, 285, 8453, 273, 253, 34889, 2167, 436, 310, 417, 7777, 247, 14855, 275, 619, 4743, 50276, 15177, 891, 513, 5730, 326, 627, 497, 387, 1878, 690, 3215, 26208, 4679, 327, 15302, 16280, 4440, 257, 292, 2057, 440, 1915, 456, 4384, 27471, 390, 1633, 751, 5053, 22359, 275, 619, 4743, 436, 310, 253, 5962, 14855, 273, 436, 2929, 347, 4767, 1840, 4583, 891, 369, 1077, 17847, 342, 253, 3290, 273, 436, 789, 253, 760, 1745, 80, 326, 16780, 562, 281, 479, 1223, 4361, 369, 1459, 37968, 3185, 273, 5235, 275, 298, 1549, 50276, 498, 15752, 891, 717, 1077, 7615, 342, 253, 4499, 422, 4715, 594, 717, 3164, 247, 23539, 29107, 1060, 533, 891, 1869, 253, 22909, 285, 4679, 497, 6685, 15246, 285, 1327, 8259, 5302, 891, 452, 642, 14672, 670, 253, 19843, 619, 760, 2905, 14876, 310, 326, 247, 12219, 281, 30762, 288, 310, 4845, 275, 253, 2022, 2505, 923, 27722, 2708, 50276, 9188, 40348, 347, 4767, 275, 253, 1840, 3236, 414, 2593, 697, 2834, 323, 247, 34889, 2929, 281, 452, 6149, 84, 700, 22993, 3486, 347, 253, 1318, 8127, 8696, 275, 253, 42068, 2366, 273, 789, 326, 7933, 4961, 891, 13414, 1158, 326, 436, 407, 667, 2097, 310, 247, 2036, 4016, 891, 651, 5958, 436, 2929, 347, 581, 273, 253, 625, 1534, 4394, 326, 209, 422, 1705, 2439, 275, 253, 1390, 2164, 3618, 273, 549, 32693, 533, 352, 1057, 1729, 352, 2708, 752, 247, 8059, 1682, 2929, 1537, 1007, 751, 50275, 43671, 50276, 50237, 288, 4983, 432, 298, 24, 1348, 310, 6685, 4722, 285, 891, 369, 30685, 323, 247, 2074, 2593, 1561, 253, 2022, 2505, 1223, 24049, 326, 1142, 3104, 310, 45783, 387, 436, 3924, 273, 253, 2929, 891, 651, 1804, 326, 247, 3806, 281, 30762, 288, 342, 247, 25044, 273, 752, 352, 4428, 310, 2879, 281, 253, 2022, 2505, 50276, 9691, 1271, 299, 285, 269, 2085, 247, 18382, 4291, 533, 7899, 285, 10799, 5740, 273, 7364, 285, 253, 1896, 5886, 281, 38058, 3486, 2490, 187, 4118, 18435, 27, 783, 2929, 275, 2087, 2959, 1264, 2762, 8680, 84, 285, 17503, 253, 1264, 30628, 512, 9446, 253, 10527, 3590, 1255, 273, 253, 2929, 285, 253, 2929, 310, 671, 4518, 3559, 342, 27096, 285, 2266, 5661, 1543, 627, 310, 247, 
1643, 5053, 2403, 581, 37317, 1679, 9848, 275, 2426, 273, 253, 3242, 12510, 273, 253, 4081, 3762, 1223, 4583, 253, 5661, 1543, 403, 11088, 285, 10323, 915, 263, 253, 15081, 2792, 1160, 407, 253, 2929, 253, 4477, 778, 11829, 379, 19148, 253, 2792, 1754, 327, 253, 5701, 50275 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposes an uncertaintyaware hierarchical refinement scheme for incremental implicitlyrefined classification exploring inheritance relations of multilevel semantic increment it consists of a global representation extension module and a hierarchical distribution alignment module strengths 1 this paper introduces global representation extension and hierarchical distribution alignment to the iirc task for better hierarchical invariance and conducts experiments on iirccifar dataset to verify its effectiveness 2 this paper makes progress in a newly proposed task iirc which focus on the robustness of models when processing data of different categories and differentgrained classes weakness 1 there is only one dataset for overall performance comparison experiments more experiments and analyses on more datasets should be added to verify the effectiveness of the proposed method 2 the effectiveness of separate module is not fully verified the ablation results of the gre module in table 1 is lower than the baseline more explanations should be given the authors should give more introduction and limitation analyses about the proposed method in realistic application scenarios docsepinspired by recent incremental implicitlyrefined classification iirc works this paper proposes an uncertaintyaware hierarchical refinement uahr method which explores the semantic correlation of different granularity levels of image labels to ensure the consistency assumption of the feature distribution of class from different granularity levels technically a global representation extension strategy is developed to boost the interclass discriminability of newlyadded classes and a hierarchical distribution alignment strategy is designed to ensure distribution consistency across different classes belonging to the same superclass experiments are conducted on iirccifar and iircimagenetlite datasets 1 this paper finds the feature distribution consistency across the classes with inheritance relationships further to guide the optimization of the incremental learning process this paper designs the hierarchical distribution alignment strategy to exploit the correlation of hierarchical class distributions 2 ablation studies show the effectiveness of each proposed module 3 this paper is wellwritten clearly presenting the main idea technical contribution it is interesting that the authors find that the current incremental training phase would destroy the feature distribution consistency between super and subclasses however the part about the global representation extension strategy in the paper is relatively weak due to the following two reasons the authors did not explain why the rbf kernel can effectively measure the uncertainty of interclass features the knowledge distillation kd loss between the current model and old model is a widelyused technique in the classincremental learning field ref 1 2 3 i feel that the author seems to directly apply the kd loss to solve the knowledge forgetting issue and the kd loss is not adapted according to the iirc task which reduces my rating of technical contribution for this paper please explain the differences between the kd method used in this paper and the commonlyused kd methods during the rebuttal period technical details the authors use the rbf kernel to calculate the uncertainty between different classes why choose the rbf kernel to calculate interclass uncertainty will 
the effectiveness of the method decline without the rbf mapping actually eq 1 presents that the mean diversity across all novel classcenters is formulated but different classes may have different importance for iirc tasks as a result is a classwise diversity uncertainty metric better in line 187 page 5 the authors state that the output entropy of the old classes needs to be subtracted by a margin distance value it is unclear how to obtain such a margin distance experiments the work ref 4 was also conducted on the iirccifar and iircimagenetlite datasets but this paper did not compare with the original papers results in the work ref 4 besides the definition of the horizontal axis in figure 5 is also inconsistent with that in figures 35 in ref 4 please explain the reason for this inconsistency during the rebuttal period ref 1 semanticaware knowledge distillation for fewshot classincremental learning cvpr2021 ref 2 fewshot classincremental learning via relation knowledge distillation aaai2021 ref 3 classincremental learning by knowledge distillation with adaptive feature consolidation cvpr2022 ref 4 iirc incremental implicitlyrefined classification cvpr2021 docsepthis paper proposes a method called uahr to address the task of incremental implicitly refinement classification specifically two components global representation extension gre and hierarchical distribution alignment hda are proposed and integrated together to measure the uncertainty and align the distribution experimental studies on public datasets including iirccifar and iircimagenetlite show the effectiveness of the proposed uahr scheme there are several strengths of the proposed uahr the paper is well motivated the ablation study is extensive there are many intuitive visualizations of the intermediate results enhancing the credit of the motivation also some weaknesses exist the experimental comparison with the sota methods is not sufficient the representation of the paper is poor details are missing as indicated in the above sections there are issues with experimental studies representations and missing details for details please see the above sections docsepthis paper tackles the incremental implicitlyrefined classification by proposing a framework that is based on uncertaintyaware hierarchical refinement in different stages label relationships are better leveraged by the proposed method for semantic reasoning and the uncertainty further assists this inference process the proposed method achieves competitive performance in two popular benchmarks strength good paper writing and motivation to construct the class hierarchy clear illustrations in figure 4 and figure 5 that facilitate understanding weaknesses are mainly related to the novelty the hierarchical class structure has been adopted in1 the use of global representation has also been leveraged in 2 entropybased margin controller has been proposed in 3 1 largescale fewshot learning knowledge transfer with class hierarchy cvpr 2019 2 fewshot learning with global class representations iccv 2019 3 spot and learn a maximumentropy patch sampler for fewshot image classification cvpr 2019 the limitations have been discussed in the appendix ### Summary:
three reviewers are positive about this paper although the rating of one reviewer is borderline reject he is fairly confident in his assessment in effect the authors have well addressed all the reviewers concerns in the rebuttal so i suggest accepting this paper
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper proposes to select parameters for augmentation through a training procedure without validation data, like the prior work on augerino. it does so by introducing a bayesian model selection framework for selecting the ideal set of augmentations for a neural network. the effort in making this model tractable is notable, and the evaluation shows some benefits over strong baselines. strengths: the paper is clear and reads well, though the treatment of technical derivations is quite long. the model shows sensible behavior, ie figure 2. the model has improvements over augerino, eg table 1. diversity of datasets, models and augmentations seems acceptable but can still be strengthened, eg table 1. weaknesses: one baseline is noninvariant, which seems like a weak baseline since there is often a significant performance boost by just using augmentations in a simple way. augerino is a stronger baseline, but the results are much less noticeable against that baseline, eg figure 3. it is not clear nor intuitive why this method works; some examples of failure modes of augerino that were addressed would be enlightening. post rebuttal: many of the paper's strengths are due to being a bayesian treatment of the problem of setting augmentation parameters, which avoids hyperparameter tuning. there is a convincing case that it would be used in the future as a building block, as this approach works on a standard set of benchmarks and is otherwise sensible. the evaluation can be strengthened as always, eg scaling the approach to more difficult datasets, but i think the paper has sufficient contribution as is, and so i am increasing my score. na docsep the paper proposes a gradient-based method for automatically selecting the data augmentation parameters without the validation data. the trick is to parameterize the data augmentation scheme, approximate the marginal likelihood with respect to these data augmentation parameters, and optimize them during training. the authors propose schemes such as the kfac approximation and explicit automatic differentiation to make the computation of the differentiable marginal likelihood more tractable. empirically, the proposed approach is capable of recovering invariances present in the data on mnist, fashionmnist and cifar10. strengths: the topic is interesting and relevant to the neurips community. the paper is well written and the technical details look correct to me. while limited to small datasets and models, the experiments demonstrate that the proposed approach can successfully recover invariances present in the model and can help improve generalization. weaknesses: there are several specific limitations of the proposed approach. to give some specific examples, 1) the proposed approach does not scale to large datasets and models, 2) the method requires task-specific data augmentation parameterization, eg affine transformations for image classification tasks, and it is not straightforward to apply the proposed method in other domains, and 3) the proposed algorithm is challenging to implement on standard deep learning frameworks and introduces additional computational overhead that scales quadratically with model parameters and linearly with the number of datasets. while i believe that the toy experiments presented in the paper demonstrate the proposed approach's effectiveness, the proposed method's scalability is still questionable. it would be helpful if the authors also compared their algorithms with other data augmentation adaptation
methods that make use of validation datasets. to my understanding these methods should be much more efficient than the proposed approach, although they require a validation dataset. if we select 20% of the training data as validation data, shouldn't the prior algorithms also be able to learn these invariances? post rebuttal: thank you for the authors' response. i acknowledge that i have read the rebuttal and other reviewers' comments. as the author addressed all my concerns in the paper, i increased my score. the work does not have a potential negative societal impact. docsep the authors extend previously proposed techniques for a laplace approximation for bayesian model selection to learn and incorporate invariances while training a deep network. to make this approximation computationally feasible, the authors propose techniques to apply a kronecker-factored approximation. the authors extensively evaluate their method on many datasets with ablations and comparisons to another method that learns invariances while training. strengths: the authors have very illustrative experiments (figure 2 is very well presented) that show that the models they train are learning the invariances present in the dataset. training a network to capture the invariances present in a dataset is of significant interest for the machine learning community. furthermore, they test their method on a wide variety of datasets/network architectures. the authors clearly describe the novelty of their approach in comparison to immer et al 2021; they describe how the aforementioned work would be computationally infeasible for their proposed task. weaknesses: one goal of this work is to obtain a model that is invariant to a set or distribution of transformations (lines 16-17), as measured by test accuracy in table 1. in the case where one wants a network with invariance to the natural variations in a dataset, the authors should justify their approach over simply training with augmented samples as done in randaugment (cubuk et al 2019). for example, can the authors' approach provide benefit over randaugment on imagenet using their six candidate augmentations? it should be stated that the authors' proposed method has significance outside this scenario, since they also identify which invariances are present, as opposed to augmentation approaches; nevertheless, such a comparison is important for the community to understand which method should be used for this given circumstance. figure 4 of the appendix is very nice for measuring performance of different approximation methods. to strengthen their work the authors should consider giving empirical runtime results as well, so that practitioners can understand the empirical performance / run-speed tradeoffs. from the experiments the authors have presented, it appears that their method requires the set of candidate invariances to be specified beforehand; this is limited for datasets where the invariances are not known beforehand. the authors have not released code ### Summary:
after reading the submission and its reviews, my understanding is that the submission proposes to use tractable laplace approximations (generalized gauss-newton and kfac) to learn suitable invariances/augmentations for the considered neural network. the derivations of this paper and the description of the method are clear and well motivated. while the experiments illustrate the concept and the soundness of the method, they remain small scale and are limited to sets of parameterized augmentations. nonetheless i recommend this submission for acceptance
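for orientation, a rough sketch of the kind of objective the reviews above describe, written in the standard laplace-approximation form; the symbol eta for the augmentation parameters and this exact decomposition are illustrative assumptions rather than the paper's exact notation. the log marginal likelihood of the data given augmentation parameters eta is approximated around the trained weights theta_* as

\log p(\mathcal{D} \mid \eta) \approx \log p(\mathcal{D} \mid \theta_*, \eta) + \log p(\theta_*) + \frac{d}{2}\log 2\pi - \frac{1}{2}\log\det H_{\theta_*}(\eta)

where H is the hessian (or a generalized gauss-newton / kfac surrogate) of the negative log joint at theta_*. with a kronecker-factored surrogate, the log-determinant becomes cheap enough to differentiate with respect to eta, which is what allows the augmentation parameters to be selected by gradient descent without a validation set.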
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes a method to solve wasserstein gradient flows based on the jko scheme, using variational formulations of functional objectives such as the kl divergence or the generalized entropy (nonlinear diffusion). relying on known reformulations of the jko scheme as optimization over convex functions, the paper departs from recent related methods in expressing certain objectives as f-divergences and in turn using the dual formulation of these divergences to circumvent the need to do explicit density computation in these. the resulting method involves parametrizing two types of operators as neural networks, one of them as an input-convex neural network, and solving a minimax objective. the paper presents experiments on simple pdes, mostly in 1d or 2d, with known solutions. strengths: an ingenious use of f-divergence duality (aka the variational formulation) to rewrite density-dependent functional flow objectives, such as the generalized entropy, as optimization of expectations, allowing for computation via finite-sample approximation. the resulting method seems to be empirically valid in low- and mid-dimensional settings. weaknesses: novelty: the main and perhaps only novelty of this approach compared to mokrov et al 2021, alvarez-melis et al 2021 and bunne et al 2021 is the use of the variational formulation of f-divergences, something that is itself well known. note that this is only relevant for functional objectives that depend on the density of the measure itself, eg those that cannot be expressed as an expectation over the measure; all other objectives, including the interaction energy considered here, can be tackled with the exact same approach of the three papers above. the novel contributions should be more clearly separated from non-novel ones, eg the reformulation of wgf as optimization over convex functions might seem like a contribution of this paper to an uninitiated reader, although it has been explored extensively in other work, most prominently in benamou et al 2014 (which, puzzlingly, is not cited here) and more recently in the 3 jko-icnn papers cited above. some statements are not clear, not correct, or too vague; these should all be clarified or corrected, for example "that is not too far away from the identity map" in pg 5. in section 2.2 the delta f / delta p is not formally a gradient but the first variation of a functional. in section 3.6 it is stated that prior works require explicit computation of log determinants of hessians at cubic complexity, but this is not the case; at least one of those methods uses quadratic-cost matrix-vector-product stochastic estimators of the hessian log-determinant. the motivation for using the fb scheme for the interaction energy is not clear; some of the intuition/motivation discussed in sections b2 and b3 should probably be moved to the main text. limited experimental validation, especially with regard to high-dimensional settings, which is arguably the main promise of this work; in addition, all experiments are synthetic. a more compelling evaluation framework should include at least one high-dimensional realistic dataset. the evaluation is mostly qualitative: the paper purposefully chooses pdes with known solutions but then provides mostly qualitative/visual results. a more compelling evaluation framework would include quantitative comparison against these known solutions in sections 4.3 and 4.4. while the computational complexity summary provided in the paper is useful, there are so many hidden constants in those
bounds that they are hardly useful; these should be complemented with a thorough empirical runtime analysis comparing against exact and inexact methods, such as those cited here as related work. minor comments: while the motivation is different, there are deep and unexplored connections between the dualization of the f-divergence objectives used here and the dual of the kantorovich ot problem, which has been recently explored extensively for learning monge maps between distributions; many of these rely on convex conjugacy to reformulate a sup objective as a sup-inf one, see korotin et al 2021b for an excellent survey on these. it would have been great, though not obligatory, to see a discussion on these connections. i would suggest moving table 1 after proposition 1, so that all objects in the definition of mathcalath have been already introduced; i spent some time trying to figure out what mu_t was before finding it further down. missing related work: benamou et al, discretization of functionals involving the monge-ampère operator, 2014. huang et al, convex potential flows: universal probability distributions with optimal transport and convex optimization, iclr 2021. bunne et al, jkonet: proximal optimal transport modeling of population dynamics, 2021. the paper provides an interesting variation on recent jko-based methods for computational wasserstein gradient flows, but its limited novelty and empirical evaluation diminish its contribution and make it a borderline paper in my view. that being said, if the issues i raise in my review are properly addressed, i would be willing to increase my score. docsep this paper studies the implementation of some wasserstein gradient flows (wgf) in discrete time but without discretizing the space. the methods proposed are based on the jko operator to discretize wgf in time. the implementation of the jko can be challenging; the strategy of the authors is to first reparametrize the jko as a minimization over a space of functions instead of measures, via pushforward. then, when the objective function is an f-divergence, the objective inside the jko admits a variational representation and can be expressed as a sup. in conclusion, each jko is written as a min-max over a space of functions; to solve it, they parametrize the functions by neural networks and alternately maximize and minimize the problem using adam. an important feature is that the objective in the min-max can be approximated with samples of the current distribution; its density doesn't appear, only integrals wrt the current distribution. strengths: the paper is well written and high-level ideas on the theory are explained; the approach is clear and reasonable. the simulations show promising results; they do not seem to have been cherry picked, they show what is claimed by the authors, and there is no surprise given the simplicity of the approach. weaknesses: the theoretical consistency of the proposed methods is ignored. are the proposed schemes consistent time discretizations of the wgf? are the assumptions satisfied? is the alternating maximization-minimization strategy (algo 1) consistent? do all measures considered admit a density wrt lebesgue? the paper mainly provides intuition for algo 1 without solid theoretical foundations. moreover, the technical contribution is rather limited, see the summary of the paper above; one could argue that it is an easy paper in the sense that only the simulations seem new. the novelty is mainly to have parametrized the functions in the objective of the jko by neural networks. the approach is reasonable, the simulations promising, but i do
not see a significant technical contribution: they essentially reparametrized a min-max problem over a function space using neural nets and run adam to alternately maximize and minimize. docsep this paper proposes a variational formulation of each jko step for optimizing functionals on measures. different from existing recent works on emulating jko steps by training pushforward neural networks, either directly or as gradients of convex functions, the variational formulation involves another inner maximization of a function without needing density access, which typically requires cubic time complexity due to computing the log determinants of the pushforwards. experiments are done to demonstrate the practicality of the algorithm. this paper identifies a crucial challenge in the existing works of emulating jko steps, namely the expensive computation of the densities of the form p_k(x) for each k. the solution the paper suggests is reasonable, but it comes at the cost of adding another inner maximization, which makes the optimization a lot more difficult, eg unstable due to high variance. overall i think the amount of contribution is very limited: the variational formulation of the objective functionals considered is all well known, and putting it together with the jko step is fairly straightforward. the experiments are not convincing enough to demonstrate the practical advantages of the proposed method compared to the alternatives. detailed comments: in eq 9 a distribution gamma is introduced; what is the point of introducing this measure? is it essentially an importance sampling of q, for which we only know the density? why is it enough to just choose gaussians, which could be very different from q? many sections are very similar to mokrov 2021; for example, there is nothing new in sec 3.5, and the experiment setups in 4.1 and 4.2 are exactly the same, yet the more challenging experiments from mokrov 2021 are not reproduced here, such as posterior inference and nonlinear filtering. in sec 3.6, it is common to use the hutchinson trace estimator to approximate the gradient of log det, in addition to a linear solve, which could speed up the competing methods; it might be good to include a comparison to that. the results in figure 2 are visibly worse than those of mokrov 2021; moreover, here only up to dimension 13 is included, whereas mokrov 2021 contains dimension 32. the results in figure 3 of the proposed method are better than those of mokrov 2021; i'm wondering why this is the case, since in mokrov 2021 the kl divergence is calculated exactly, whereas in the proposed method additional bias could be introduced due to the failure of maximizing h. the paper studies an important challenge in jko steps encountered by recent works, but the contributions are incremental without demonstrating convincing practical advantages. docsep the paper proposes a method to compute wasserstein gradient flows (wgfs) via neural networks and the jko scheme. in contrast to prior works, to compute wgfs of functionals involving f-divergences the authors use variational approximations rather than direct computations; it is claimed to work faster and perform better. benefits: 1) the variational approach might be a solution to issues of direct computation, such as cubically growing complexity as a function of the dimension. drawbacks: 2) the experiments of the paper are weak and do not sufficiently support the main claims, 3) the relation to the prior work is not fully disclosed, 4) the scope of the paper (wgfs) is narrow. detailed comments are below. relation to prior work: a large part
of the algorithm proposed by the authors matches the previously known art. more precisely, sections 3.0 and 3.1 exploit the reformulation of jko via functions that has already been proposed in [1] (not cited in the paper) and extensively used in [2,3]; i think this is not very clear from the text and might add extra inexistent value to the current work. if i correctly understand, the actual difference wrt the prior work, eg [2], is the way that the f-divergence is optimized: the authors use a variational approximation in contrast to direct computation. here i have two remarks. first, the key claim of the paper is that direct computation scales cubically and is not feasible in high dimensions. i agree, but what about fast approximations? in [3] the authors explicitly state they use a fast estimator based on the hutchinson method; in [2] the authors state that to speed up the computation fast approximations can be used. it is unfair to ignore this and compare in the experimental section only with [2], and only by using the direct computation. second, the variational approximation of the f-divergence is not novel, see for example f-gans [5]. i wonder if such approximations have already been used in bayesian machine learning (bml); i encourage the authors to include a detailed discussion of this. why is this important? i am not an expert in bml, but it seems to me that this particular variational approximation can be applied to a dozen tasks involving kl divergence. the current paper demonstrates that it outperforms direct computation; if this is indeed true, why hasn't this approach been used for other bml tasks? if it has already been, the authors should include relevant references to support the current experimental findings. overall, the paper should more carefully acknowledge the prior work and make it transparent what is new and what is already well known. paper [4] also seems relevant. experiments: the experimental section is very weak, both in terms of quality and quantity. in terms of quantity the paper looks poor compared to predecessors, eg [2], a paper to which they compare their method. in particular, the results provided in figure 2 visually suffer from notable artifacts, which is a contrast to that reported by [2] in a similar setup (figure 2 of [2]); besides, the dimensions considered in the current paper are lower, which is suspicious. importantly, the quantitative results provided in figure 3 raise questions: how is it possible that on such a toy example (evolving gaussian distributions, linear pushforward maps grad psi) the variational methods so drastically outperform the direct computation, up to 10 times? this is quite unbelievable, in particular in small dimensions. did the authors use the same network architectures and other shared hyperparameters for comparison? a discussion here is needed. scope: if i correctly understand, the authors do not provide any high-dimensional applications of wgfs; due to this, i currently tend to think the scope of the paper is narrow and the usefulness of the proposed approach for the community of iclr is questionable. adding an application would definitely benefit the paper. correctness of the method: overall the method is correct; however, in one of the experiments the authors approximate the pushforward of the jko step directly as the neural network, not as the gradient of the input-convex network. in this case the pushforward distribution might not have a density, damaging the entire jko scheme. clarity: why the fb scheme is introduced (section 3.3) is not clear from the text. references: [1] benamou j-d, carlier g, mérigot q, oudet
e (2016). discretization of functionals involving the monge-ampère operator. numerische mathematik, 134(3), 611-636. [2] mokrov p, korotin a, li l, genevay a, solomon j, burnaev e (2021). large-scale wasserstein gradient flows. arxiv preprint arxiv:2106.00736. [3] alvarez-melis d, schiff y, mroueh y (2021). optimizing functionals on the space of probabilities with input convex neural networks. arxiv preprint arxiv:2106.00774. [4] bunne c, meng-papaxanthos l, krause a, cuturi m (2021). jkonet: proximal optimal transport modeling of population dynamics. arxiv preprint arxiv:2106.06345. [5] nowozin s, cseke b, tomioka r (2016, december). f-gan: training generative neural samplers using variational divergence minimization. in proceedings of the 30th international conference on neural information processing systems, pp 271-279. my overall impression of the paper is that it is unfinished. while the idea of variational approximation is reasonable, i suppose this paper requires a major revision with a dozen text improvements and experiments; therefore i vote to reject this paper in its current form ### Summary:
the paper proposes a min-max reformulation for jko gradient flows, appealing to the variational formulation of f-divergences; this would alleviate the need for an explicit density. all reviewers pointed out the limited novelty in the work and the limited experimentation. we encourage the authors to add a theoretical analysis to their work, to further strengthen the experimental section with high-dimensional experiments, and to resubmit the work to an upcoming venue
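for orientation, a brief sketch of the standard formulas behind the discussion above; the notation (rho for the flowing measure, tau for the step size, psi for the convex potential, h for the dual test function) is generic and assumed for illustration rather than copied from the paper. a jko step discretizes the flow of a functional F in wasserstein space as

\rho_{k+1} = \arg\min_{\rho} \; \mathcal{F}(\rho) + \frac{1}{2\tau} W_2^2(\rho, \rho_k)

and the f-divergence terms in \mathcal{F} admit the variational (dual) representation

D_f(\rho \,\|\, \mu) = \sup_{h} \; \mathbb{E}_{x \sim \rho}[h(x)] - \mathbb{E}_{x \sim \mu}[f^*(h(x))]

with f^* the convex conjugate of f. writing \rho = (\nabla\psi)_{\#}\rho_k for a convex potential \psi (eg an input-convex neural network) turns each step into a min over \psi and a max over h, where both expectations can be estimated from samples of \rho_k, so no explicit density or log-determinant of the pushforward is required; this is the min-max structure the reviews refer to.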
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 271, 4722, 7629, 327, 3332, 480, 76, 706, 833, 3082, 323, 15180, 369, 2152, 6339, 11786, 14221, 533, 697, 3710, 38135, 285, 16774, 7103, 37856, 697, 7680, 285, 1056, 352, 247, 45210, 2929, 275, 619, 1859, 326, 1146, 753, 604, 253, 3374, 891, 7164, 275, 619, 2278, 403, 6283, 9713, 891, 651, 320, 7378, 281, 2572, 619, 4868, 50276, 7152, 33032, 2520, 2929, 2175, 253, 7092, 273, 690, 369, 2152, 6339, 11786, 14221, 259, 29976, 275, 13358, 673, 533, 1293, 35132, 3006, 253, 2317, 253, 3082, 4081, 403, 1754, 327, 253, 480, 7381, 5572, 281, 35132, 907, 259, 29976, 275, 673, 253, 7092, 273, 253, 480, 7381, 476, 320, 11132, 253, 5700, 273, 253, 4477, 310, 281, 806, 294, 3575, 292, 363, 2721, 253, 480, 7381, 347, 247, 41458, 689, 247, 2317, 273, 3470, 3185, 273, 5593, 3066, 7450, 10495, 840, 672, 253, 8103, 1159, 310, 247, 29439, 2373, 9515, 253, 8103, 3304, 253, 480, 7381, 11476, 247, 39762, 6779, 285, 476, 320, 4469, 347, 247, 7018, 6452, 1016, 480, 7381, 310, 3542, 347, 247, 1054, 2781, 689, 247, 2317, 273, 3470, 281, 8415, 352, 597, 30364, 363, 2721, 253, 3470, 407, 11454, 6928, 285, 31506, 22950, 285, 15338, 253, 1895, 970, 38622, 271, 1774, 4735, 310, 326, 253, 8103, 275, 253, 1054, 2781, 476, 320, 34930, 342, 3530, 273, 253, 1655, 3268, 697, 4038, 36908, 3176, 760, 28676, 8772, 281, 253, 1655, 3268, 50276, 296, 3755, 20556, 50276, 783, 2929, 310, 973, 3542, 285, 1029, 1268, 5697, 327, 253, 3762, 403, 5544, 253, 2746, 310, 2590, 285, 5272, 253, 9938, 921, 12532, 1543, 597, 513, 417, 1646, 281, 452, 644, 33804, 5055, 597, 921, 752, 310, 7558, 407, 253, 4477, 285, 627, 310, 642, 9326, 1677, 253, 17647, 273, 253, 2746, 50276, 20881, 1255, 265, 50276, 783, 10527, 2882, 566, 273, 253, 4081, 3082, 310, 12841, 403, 253, 4081, 6974, 5185, 673, 35132, 5904, 273, 253, 259, 29976, 403, 253, 13260, 10048, 310, 253, 5795, 11903, 1320, 1222, 27996, 5700, 30390, 337, 5185, 513, 512, 5593, 2783, 11476, 4038, 8772, 458, 12133, 25070, 253, 2929, 7194, 3400, 30328, 323, 30390, 337, 1293, 4891, 10527, 27629, 50275, 3062, 1189, 253, 7681, 7680, 310, 2581, 3710, 923, 253, 6010, 273, 253, 2929, 1840, 581, 812, 9059, 326, 352, 310, 271, 3477, 2929, 275, 253, 3282, 326, 760, 253, 9938, 1646, 747, 253, 38135, 310, 7194, 281, 452, 30364, 50065, 253, 3470, 275, 253, 8103, 273, 253, 480, 7381, 407, 11454, 6928, 50276, 783, 2746, 310, 5272, 253, 9864, 12532, 533, 891, 513, 417, 923, 247, 1534, 7681, 7680, 597, 9093, 294, 3575, 292, 50065, 247, 1054, 2781, 1895, 689, 247, 1159, 2317, 970, 11454, 37507, 285, 3515, 38622, 281, 31506, 22950, 285, 15338, 50276, 7152, 33032, 2520, 2929, 29328, 247, 39762, 15895, 273, 1016, 480, 7381, 3213, 323, 39793, 1159, 932, 327, 5593, 1027, 432, 5368, 3332, 2987, 327, 802, 8287, 480, 7381, 5018, 407, 3733, 7450, 10495, 11454, 6928, 2057, 3587, 390, 347, 27935, 273, 17133, 3470, 253, 39762, 15895, 8687, 1529, 6703, 11903, 1320, 273, 247, 1159, 1293, 25312, 4038, 2289, 326, 5431, 4419, 23664, 673, 10454, 1955, 281, 12672, 253, 2412, 29647, 273, 253, 7450, 1542, 4515, 4679, 403, 2218, 281, 7568, 253, 8542, 414, 273, 253, 5933, 50276, 2520, 2929, 22649, 247, 9560, 5691, 275, 253, 5368, 2987, 273, 802, 8287, 480, 7381, 5018, 10775, 253, 8214, 13782, 273, 253, 16689, 273, 253, 830, 268, 76, 89, 323, 1016, 465, 253, 2900, 253, 2929, 5936, 310, 5272, 533, 352, 3249, 387, 253, 2105, 273, 6240, 1529, 6703, 11903, 1320, 534, 2789, 253, 13757, 247, 2257, 625, 2834, 24088, 17631, 1955, 281, 1029, 11041, 4583, 891, 1158, 253, 2408, 273, 7680, 310, 1077, 3710, 253, 39762, 15895, 273, 253, 8103, 1159, 932, 
2783, 310, 512, 973, 1929, 285, 8133, 352, 2366, 342, 253, 480, 7381, 3213, 310, 9648, 15246, 253, 4679, 403, 417, 21414, 2217, 281, 7568, 253, 8542, 11361, 273, 253, 4081, 1332, 2429, 281, 253, 18075, 50276, 5992, 7193, 5701, 50276, 249, 16186, 898, 247, 3268, 17356, 310, 5611, 752, 310, 253, 1127, 273, 16984, 436, 2557, 310, 352, 9093, 271, 6349, 10491, 273, 2805, 323, 534, 359, 760, 871, 253, 4038, 2139, 310, 352, 2217, 281, 816, 5206, 305, 10064, 2458, 534, 812, 320, 1077, 1027, 432, 2805, 50276, 20415, 7118, 403, 1077, 2074, 281, 278, 536, 18540, 43425, 323, 1650, 627, 310, 2717, 747, 275, 4706, 4791, 285, 253, 4679, 9978, 275, 7609, 285, 5976, 403, 4555, 253, 1072, 2568, 253, 625, 11132, 4679, 432, 278, 536, 18540, 43425, 403, 417, 23775, 1060, 824, 347, 12637, 17032, 285, 14561, 19690, 50275, 249, 4706, 5540, 352, 310, 1846, 281, 897, 253, 288, 9248, 9258, 10711, 29107, 281, 16851, 253, 11786, 273, 2412, 843, 275, 1635, 281, 247, 4872, 8415, 534, 812, 3885, 598, 253, 11771, 3082, 352, 1537, 320, 1175, 281, 2486, 247, 5301, 281, 326, 50276, 783, 1543, 275, 4677, 374, 403, 47975, 7197, 685, 1110, 273, 278, 536, 18540, 43425, 25761, 1060, 760, 598, 281, 7877, 2145, 310, 2908, 5727, 278, 536, 18540, 43425, 4428, 7877, 4567, 50276, 783, 1543, 275, 4677, 495, 273, 253, 4081, 1332, 403, 1805, 685, 1110, 273, 278, 536, 18540, 43425, 516, 12371, 2139, 436, 310, 253, 1083, 1580, 275, 278, 536, 18540, 43425, 253, 27451, 23279, 310, 5118, 4555, 5727, 275, 253, 4081, 1332, 3081, 8492, 812, 320, 5611, 1955, 281, 253, 4433, 273, 46875, 288, 50276, 783, 2929, 2175, 271, 1774, 5691, 275, 480, 7381, 5018, 14494, 407, 3332, 2987, 533, 253, 9021, 403, 32809, 1293, 17227, 21414, 8542, 11361, 50276, 7152, 339, 431, 248, 2929, 29328, 247, 1332, 281, 11897, 369, 2152, 6339, 11786, 14221, 259, 72, 3671, 3066, 11454, 6928, 285, 253, 480, 7381, 6974, 275, 4499, 281, 2720, 2987, 281, 11897, 259, 72, 3671, 273, 1159, 932, 7668, 29439, 2373, 1541, 707, 253, 4477, 897, 39762, 34754, 2581, 685, 1480, 30745, 352, 310, 7558, 281, 789, 7938, 285, 1347, 1805, 5373, 337, 253, 39762, 2746, 1537, 320, 247, 2900, 281, 3374, 273, 1480, 13782, 824, 347, 12966, 1037, 5675, 10454, 347, 247, 1159, 273, 253, 7877, 50276, 6553, 12270, 50276, 19, 253, 4679, 273, 253, 2929, 403, 5075, 285, 513, 417, 10481, 1329, 253, 2022, 3916, 495, 253, 5886, 281, 253, 2720, 789, 310, 417, 4751, 10557, 577, 253, 7990, 273, 253, 2929, 259, 72, 3671, 310, 6891, 50276, 5992, 7193, 5701, 403, 2708, 50276, 16429, 281, 2720, 789, 247, 1781, 629, 273, 253, 5933, 4081, 407, 253, 4477, 10129, 253, 3786, 1929, 1445, 625, 10534, 7118, 1884, 285, 4562, 22059, 253, 8460, 1427, 273, 480, 7381, 3066, 3470, 326, 452, 2168, 644, 4081, 275, 337, 417, 11106, 275, 253, 2929, 285, 18171, 908, 275, 3495, 891, 1158, 436, 310, 417, 1077, 2590, 432, 253, 2505, 285, 1537, 823, 4465, 29257, 6688, 1318, 281, 253, 1655, 789, 50276, 338, 891, 9113, 2096, 253, 4588, 3064, 8772, 253, 2720, 789, 24088, 374, 310, 253, 1039, 326, 253, 29439, 2373, 9515, 310, 18325, 253, 4477, 897, 247, 39762, 11193, 275, 4499, 281, 1480, 13782, 1060, 891, 452, 767, 16157, 50276, 7053, 253, 2234, 1750, 273, 253, 2929, 310, 326, 1480, 13782, 11498, 12966, 1037, 285, 310, 417, 17887, 275, 1029, 10103, 891, 5194, 533, 752, 670, 3809, 34754, 275, 495, 253, 4477, 11120, 1375, 597, 897, 247, 3809, 29107, 1754, 327, 253, 288, 9248, 9258, 1332, 275, 374, 253, 4477, 1375, 326, 281, 3885, 598, 253, 13782, 3809, 34754, 476, 320, 908, 352, 310, 16593, 281, 11823, 436, 285, 7277, 275, 253, 5661, 2593, 760, 
342, 374, 970, 285, 760, 407, 970, 253, 1480, 13782, 50276, 9815, 253, 39762, 11193, 273, 253, 29439, 2373, 9515, 310, 417, 4460, 923, 323, 1650, 269, 72, 507, 608, 891, 4282, 604, 824, 34754, 452, 2168, 644, 908, 275, 17699, 16561, 5145, 4715, 270, 1686, 891, 11907, 253, 4477, 281, 2486, 247, 7000, 5955, 273, 436, 2139, 310, 436, 1774, 891, 717, 417, 271, 6485, 275, 270, 1686, 533, 352, 3133, 281, 479, 326, 436, 1798, 39762, 11193, 476, 320, 3732, 281, 247, 13365, 273, 8892, 7668, 27451, 23279, 253, 1655, 2929, 14371, 326, 352, 41731, 13015, 1480, 13782, 604, 436, 310, 6296, 2032, 2139, 556, 2649, 436, 2746, 644, 908, 323, 643, 270, 1686, 8892, 604, 352, 556, 2168, 644, 253, 4477, 943, 2486, 4623, 10414, 281, 1329, 253, 1655, 5661, 4342, 50276, 1189, 455, 253, 2929, 943, 625, 9257, 14409, 253, 2720, 789, 285, 1056, 352, 13955, 752, 310, 747, 285, 752, 310, 2168, 973, 1929, 2929, 577, 671, 3133, 4623, 50276, 16217, 3825, 253, 5661, 2593, 310, 1077, 5075, 1097, 275, 2426, 273, 3290, 285, 10671, 275, 2426, 273, 10671, 253, 2929, 4453, 4105, 2429, 281, 43211, 24088, 374, 247, 2929, 281, 534, 597, 7277, 616, 1332, 275, 1798, 253, 1543, 2530, 275, 4677, 374, 25910, 11089, 432, 16613, 24165, 534, 310, 247, 4499, 281, 326, 2361, 407, 374, 275, 247, 2074, 9978, 4677, 374, 273, 374, 16280, 253, 10103, 2783, 275, 253, 1655, 2929, 403, 2406, 534, 310, 20634, 50276, 2948, 5954, 253, 11745, 1543, 2530, 275, 4677, 495, 7164, 3533, 849, 310, 352, 1896, 326, 327, 824, 247, 20953, 1650, 25537, 305, 12064, 10670, 50276, 8172, 7450, 10495, 8115, 3805, 3714, 74, 253, 39762, 3082, 594, 31063, 562, 32231, 253, 1480, 13782, 598, 281, 884, 2069, 436, 310, 3240, 42798, 275, 1798, 275, 1355, 10103, 858, 253, 4477, 897, 253, 1072, 2990, 35615, 285, 643, 6096, 4373, 22041, 323, 5301, 247, 5955, 1060, 310, 3058, 50276, 14329, 604, 891, 9113, 2096, 253, 4477, 513, 417, 2085, 667, 1029, 6967, 4893, 273, 259, 72, 3671, 1955, 281, 436, 891, 4390, 5257, 281, 1158, 253, 7990, 273, 253, 2929, 310, 6891, 285, 253, 31471, 273, 253, 4081, 2746, 323, 253, 3114, 273, 17857, 32888, 310, 30455, 6240, 271, 2898, 651, 7964, 5649, 253, 2929, 50275, 28113, 1255, 273, 253, 1332, 4583, 253, 1332, 310, 3451, 2299, 275, 581, 273, 253, 4679, 253, 4477, 16851, 253, 7450, 10495, 273, 253, 480, 7381, 3213, 3587, 347, 253, 11454, 2990, 417, 347, 253, 11786, 273, 253, 3280, 44181, 2990, 275, 436, 1083, 253, 7450, 3579, 3268, 1537, 417, 452, 4038, 24038, 253, 2862, 480, 7381, 6974, 50276, 498, 15752, 752, 323, 253, 49962, 6974, 310, 5611, 2593, 5922, 436, 310, 417, 2590, 432, 253, 2505, 50276, 250, 3065, 50276, 18, 2240, 312, 276, 480, 277, 1113, 3623, 305, 278, 10389, 302, 2805, 50276, 2995, 292, 299, 4022, 35132, 1320, 273, 1159, 932, 7668, 253, 1114, 463, 1301, 250, 5572, 4520, 15488, 11076, 1479, 13900, 20, 721, 13210, 1812, 50276, 19, 278, 536, 18540, 268, 37720, 26834, 247, 632, 298, 730, 1173, 333, 247, 1220, 16142, 480, 50276, 14814, 66, 1173, 299, 43425, 1236, 2510, 25912, 369, 2152, 6339, 11786, 14221, 549, 32693, 638, 3845, 549, 32693, 19, 12971, 7931, 1812, 50276, 20, 355, 42356, 18683, 261, 277, 5807, 1648, 340, 50276, 28094, 276, 11430, 340, 43425, 39793, 1159, 932, 327, 253, 2317, 273, 20552, 342, 3280, 17133, 11454, 6928, 549, 32693, 638, 3845, 549, 32693, 16899, 10487, 30715, 50276, 21, 34258, 570, 260, 278, 1205, 81, 522, 991, 14718, 375, 298, 465, 376, 2327, 247, 50276, 7317, 11317, 278, 43425, 480, 32937, 292, 19561, 8654, 4616, 14053, 273, 3072, 8062, 549, 32693, 638, 3845, 549, 32693, 16899, 25358, 16767, 50276, 22, 1024, 
6002, 249, 256, 260, 339, 413, 270, 50276, 85, 21206, 20531, 391, 4022, 372, 4246, 269, 1247, 3733, 1006, 800, 11454, 1775, 446, 398, 970, 39762, 23279, 41458, 275, 10061, 273, 253, 1884, 394, 5213, 8059, 327, 11454, 1491, 5162, 2718, 7266, 3435, 805, 2787, 619, 4583, 13214, 273, 253, 2929, 310, 326, 352, 310, 46809, 1223, 253, 2934, 273, 39762, 11193, 310, 5272, 891, 9428, 436, 2929, 4419, 247, 2201, 18520, 342, 247, 13365, 2505, 11701, 285, 4679, 3103, 891, 6273, 281, 12009, 436, 2929, 275, 697, 1655, 830, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 50276, 66, 1054, 4090, 8460, 1427, 323, 480, 7381, 11786, 14221, 23176, 253, 39762, 15895, 273, 29439, 2373, 1541, 707, 436, 651, 33623, 253, 878, 273, 271, 6843, 4038, 512, 30628, 8042, 562, 253, 3710, 38135, 275, 253, 789, 285, 253, 3710, 40290, 50275, 664, 11907, 4477, 281, 823, 247, 10527, 1783, 281, 616, 789, 285, 2007, 31845, 273, 253, 5661, 2593, 342, 1029, 15759, 4679, 285, 281, 501, 538, 2225, 253, 789, 327, 271, 15146, 18767 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper proposes a method for controlling unfairness of a model when the test distribution may differ in the marginal distribution of a feature such as gender or race it derives a statistical test for checking unfairness on the unknown test distribution via importance weighting combined with user input on the extent of the shift experiments on an academic performance dataset show that the approach achieves a low failure rate of learning an unfair classifier the work provides a conceptually clear and flexible framework for learning fair classifiers under demographic shifts the organization of the concepts seldonian framework problem description fairness tests is great and the writing is clear isolating the problem to performing the fairness test under an unknown distribution helps to understand the proposed method the flexibility of the framework to accommodate different fairness measures is an advantage the probabilistic guarantee for fairness violation is a good feature absent from many fair learning approaches for this particular problem my main issues are 1 with the setup of controlling only fairness and not the accuracy and 2 the seemingly large number of samples required for good accuracy for the method some related work is also missing which can be easily addressed a few details on the scope of the demographic variable discrete vs continuous and its relation to covariate shift need clarity i mention these issues in detail below questions to address in the response 1 why is the objective of controlling test unfairness while controlling train error justified as opposed to attending to both quantities for the test set as pointed out the determination of the fairness test is impacted by demographic shift so is the classification loss at deployment time which is to be computed for the shifted distribution is this accounted for somehow in the algorithm one strategy similar to the proposed fairness test procedure is to note that the loss conditional on t remains the same and to importance weight or bound the loss i am wondering if i am missing this important detail in the algorithm on how training and deployment error are related to each other (a rough sketch of what i have in mind appears after these questions) 2 does the demographic shift assumption rely on the demographic attribute space mathcal t being discrete the specification of marginal shift in terms of lower/upper bounds on the probability of each value of t suggests that similarly the conditioning set zeta around eq 4 is assumed to be discrete thus excluding some fairness metrics which is fine but the allowed domains for variables should be specified please mention if discreteness is a limiting assumption for the proposed algorithm 3 accuracy of shifty seems to be severely impacted even with known shifts when there are 100k data points in the middle plots of figure 3 under known shift the test fairness constraint is not harsher than the fairness constraint on the train set so what is the reason for the drop also it is more practical to have much fewer data points such as on the order of 10k at least the sizes of fairness datasets used in the literature are in the same or lower range is the severe drop in accuracy observed only in this particular dataset of exam scores or in others too it would be worthwhile to either include results for lower sample sizes on this and preferably other datasets or empirically study and discuss the issues limiting the sample efficiency for future work to address
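to make question 1 concrete here is a rough sketch of the kind of importance weighted fairness test i have in mind it is only my illustration not the papers actual procedure the function names the per group endpoint heuristic over the user supplied bounds and the student t based bound are all assumptions on my part

```python
import numpy as np
from scipy import stats


def shifted_unfairness_ucb(g_vals, t_vals, p_t, q_t_bounds, delta=0.05):
    """One-sided (1 - delta) upper confidence bound on the mean unfairness
    statistic g(theta), under a demographic shift that only moves the marginal
    of the demographic attribute t within user-supplied interval bounds."""
    g_vals = np.asarray(g_vals, dtype=float)
    t_vals = np.asarray(t_vals)
    weights = np.empty_like(g_vals)
    for t in np.unique(t_vals):
        low, high = q_t_bounds[t]
        mask = t_vals == t
        # the importance weight q(t)/p(t) enters linearly, so the worst case over
        # an interval is attained at an endpoint; this coarse version just takes
        # the adversarial endpoint and ignores the constraint sum_t q(t) = 1
        q_t = high if g_vals[mask].mean() >= 0 else low
        weights[mask] = q_t / p_t[t]
    z = weights * g_vals
    n = len(z)
    return z.mean() + stats.t.ppf(1 - delta, df=n - 1) * z.std(ddof=1) / np.sqrt(n)
```

if something like this is what algorithm 1 computes for the fairness test then my question reduces to whether the same reweighting is or should be applied to the training loss as well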
minor questions that do not affect my review 4 in algorithm 1 does the candidate model theta c from step 2 appear in the inputs to fairness testing in step 3 5 i missed the description of how the candidate models are enumerated and tested one by one can you please clarify what the output of the candidate model selection step 2 in algorithm 1 is related work there have been multiple studies on fairness guarantees under shifts which have been omitted they are making somewhat different in some cases weaker assumptions and have different guarantees it would be good to discuss differences from these works biswas and mukherjee 21 https://dl.acm.org/doi/abs/10.1145/3461702.3462596 rezaei et al 21 https://arxiv.org/abs/2010.05166 schumann et al 19 https://arxiv.org/abs/1906.09688 coston et al 19 https://dl.acm.org/doi/abs/10.1145/3306618.3314236 singh et al 21 https://arxiv.org/abs/1911.00677 dai and brown 20 https://dynamicdecisions.github.io/assets/pdfs/29.pdf if any of these works have the same problem setup such as rezaei et al 21 please consider comparing against them suggestions please mention if the demographic shift assumption is a type of covariate shift which is a better known term in the literature or how it differs please describe how the result in theorem 2 is important and used to support the method please add the reason for choosing the particular numerical optimizer from endres et al 2018 is uttest a nonsmooth function a possible dataset to experiment on is the new adult dataset derived from the us census which has distribution shifts perhaps demographic and other shifts see the folktables package https://github.com/zykls/folktables the work presents a novel and clear approach to the problem of fair learning under shifts there are some open issues on the problem setup and experimental results which i would like the authors to respond to in total the work is a technically strong contribution to the nascent literature on the problem from the newly made changes and clarifications in the response my concerns are addressed i highly encourage the authors to move the results on the new dataset the discussion of related work and the continuous demographic attributes to the main text

docsep

the paper presents a class of algorithms for ensuring fairness guarantees when the deployment data is susceptible to a demographic shift termed as a marginal shift in demographic attributes such as gender or race in comparison to the training data results are provided for two settings 1 when the exact demographic shift in the deployment setting is known and 2 when the demographic shift in the deployment environment is unknown this is done by computing a highconfidence upper bound on the prevalence of unfair behavior in the deployment environment presently done using the students ttest comparisons are made with seldonian and quasiseldonian algorithms fairlearn and fairness constraints the empirical analysis is performed on the university dataset to predict gpa using scores with gender as the fairness attribute and students race as the demographic attribute the marginal distribution of which changes across the training and deployment settings positives 1 the paper is generally wellwritten and easy to follow 2 the problem studied is definitely interesting to the community and one of the main challenges with the deployment of models in practical settings 3 the highconfidence upper bounds on unfairness in the deployment setting present an interesting approach to ensuring fairness guarantees in the deployment setting negatives 1 the authors are missing some parallel works on ensuring fairness guarantees under general distribution shift [1,2,3]
2 certain design choices are missing clear explanations such as how was the interpolation factor of 0.3 decided additional concerns 1 can the assumptions about gtheta be extended to settings where a comparison with respect to another demographic group is needed rather than ensuring it is under some threshold 2 it is not exactly clear how the approach can be extended to definitions such as individual fairness 3 in order to better assess the quality of the fair predictions in the deployment setting it would be helpful to have some empirical analysis when multiple definitions of fairness that may not be compatible are suggested by the user references 1 singh h singh r mhasawade v chunara r 2021 march fairness violations and mitigation under covariate shift in proceedings of the 2021 acm conference on fairness accountability and transparency pp 313 2 du w wu x 2021 robust fairnessaware learning under sample selection bias arxiv preprint arxiv:2105.11570 3 dai j brown s m label bias label shift fair machine learning with unreliable labels the contributions of the paper are clear however the paper is missing some related works and the baseline comparisons can be further extended to assess other factors as suggested

docsep

the authors propose an approach called shifty that provides highconfidence fairness guarantees when the distribution of training and deployment is different they proposed two approaches to tackle the problem one based on a known distribution shift and another one for when the distribution shift is unknown they evaluated their approach on a dataset for student success prediction and compared their approach with sota inprocessing fairness approaches the paper tackles an interesting and challenging problem the idea of separating the tasks in a parallel sense is interesting as they divide the problem into candidate selection and searching for the highconfidence upper bound the main issue however is the data splitting part as it seems that they didnt describe its procedure hence i am not sure whether it was a fair sample selection in which both df and dc are representative of the source data in the introduction the authors claimed that they allow users to select fairness notions from a large class depending on the application domain however according to their approach it is not clear how they can account for individual fairness can the authors elaborate on this furthermore the authors acknowledge in section d of the supplementary material that there is an assumption that each parameters intervals are independent which is often false can the authors explain such justifications their method of identifying the distribution shift in both the known and unknown shift settings relies on userprovided information indeed it is an assumption of the problem which may not hold would it be possible to detect the distribution shift at deployment and change the model accordingly why is there a need to train models for various unknown distribution shifts it is obvious that considering various shifts has a huge impact on the performance according to their results in section 42 the structure of the evaluation section is not convincing to me and the comparison with inprocessing fairness approaches does not seem fair the only valid comparison is wrt the fairness guarantees however it is not clear to me why the authors compared their approach under distribution shift with other methods that do not claim to be fair when the distributions of the train and test sets are different
also given the results presented for the experiment with unknown distribution shift it seems the seldonian algorithm performs better than their approach i would suggest that the authors compare their approach with methods proposed for concept shift also it will be useful to add at least one more dataset for the evaluation to make sure that these results are consistent in various contexts i have found the main idea of the paper interesting however i have some doubts about their proposed method and the way that they evaluated their approach and compared it with other methods see the previous section also i suggest the authors focus only on group fairness as it is not clear how their approach can be used for individual fairness methods

docsep

the paper focuses on algorithmic fairness in machine learning ml and in particular on the problem of demographic shift the motivation comes from the fact that the distribution of the underlying population might change between training and deployment the paper defines this as a shift in the marginal distribution of a single random variable race sex etc the setup considers a classification task where sensitive information the fairness attribute might not be legal to use but is available information given a certain event of interest and a target tolerance value tau eg with respect to sex the falsepositive rate of the model must be below tau for female an algorithm is defined to be fair if it achieves the target value tau with high probability given by some delta parameter to account for demographic shift the model further includes a demographic attribute t the distribution of t can change over time but the conditional distribution must remain the same by assumption given this setup the paper introduces shifty an ml algorithm that provides highconfidence guarantees that a fairness property will hold even when a demographic shift has occurred shifty works as follows it takes as input a training dataset some required fairness constraints and a description of demographic shifts then it partitions the dataset into two subsets finds a model based on the first subset then it uses q and the second subset to build an upper confidence bound for the model after deployment and given demographic shift it returns only models that are unfair after deployment with probability at most delta the experiments use test scores data from the brazilian university system and show the satisfactory performance of the proposed algorithm comments i found the formal definition of demographic shift page 3 somewhat hard to parse and understand intuitively on top of the previous comment it is not clear to me why the authors make a distinction between the fairness attribute and the demographic attribute what if the college wants to take into account both race and gender as fairness legally protected attributes also why is it reasonable to assume that t changes eq 2 but the conditional probability eq 3 does not in particular in what settings is it justified to assume that eq 3 holds section 312 assumes a more realistic scenario it is hard to think of scenarios where q would be exactly known so i think this section should be presented as part of the main setup and not as an extension however the description of the method is not very precise how do the numerical approximation and simplicial homology optimization exactly work in this setting current appendix a and the related work on pages 3-4 could be improved how does shifty provide guarantees that the other algorithms cannot would it require significant modification of the existing algorithms and how if possible to account for demographic shifts property b
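to check my reading of the scheme described above here is a rough paraphrase in code it is my own reconstruction not the authors implementation the function names the even split and the signature of the upper confidence bound routine ucb are placeholders

```python
import numpy as np


def shifty_like(X, Y, T, fit_candidate, g, ucb, tau):
    """Two-stage scheme as i read it: select a candidate model on one split,
    then certify it on the held-out split with a high-confidence upper bound
    (ucb) on the importance-weighted unfairness statistic g under the
    user-described demographic shifts."""
    n = len(Y)
    idx = np.random.permutation(n)
    c, f = idx[: n // 2], idx[n // 2:]          # candidate / fairness-test splits
    theta = fit_candidate(X[c], Y[c], T[c])     # step 1: candidate selection
    u = ucb(g(theta, X[f], Y[f], T[f]), T[f])   # step 2: fairness test under shift
    return theta if u <= tau else None          # None = "no fair solution found"
```

if this reading is correct then the guarantee property b comes entirely from the ucb step and the question above amounts to asking whether the competing algorithms could simply reuse that step or whether something deeper is needed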
in terms of writing the paper is quite clearly written there are some repetitions in certain places eg section 2 in the main contributions list page 2 i think that theoretical contribution 2 is an overstatement theorem 1 is rather obvious and no guarantees are provided for the unknown demographic shift case maybe obtaining distributiondependent bounds would be possible why was only this dataset chosen have the authors tried potentially synthetic experiments with other datasets eg the law admissions dataset used often in fair ml how is such a dramatic demographic shift explained in brazil fig 2 the sentence to evaluate each models e about evaluation in section 42 is not very clear i think the paper touches upon an important practical challenge not only in fair ml but in ml models used for decisionmaking in general the analysis is natural straightforward and technically sound the most technical aspect is the computation of the upper confidence bound which is standard so i think there is no particular novelty in its methodological contribution that said i think the conceptual contribution of the paper is still quite new and useful in the light of the novel application i thus think the algorithm is quite novel nevertheless it would be helpful if the authors provided a clearer comparison to the most related works and explained how their approach is novel finally i found the evaluation section convincing and wellexecuted i would appreciate additional experiments on other datasets widely used in fair ml even synthetic experiments

### Summary:
the paper studies the setting of groupbased fairness under the socalled demographic shift where the conditional distribution of the data given the subgroup remains the same but the marginal distribution of the subgroups can change it provides a class of algorithms which give high confidence guarantees under demographic shift in both the known and the unknown shift setting overall the paper is a worthwhile contribution it provides a new angle to the important problem of groupbased fairness with good theoretical and empirical results
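for concreteness the shift in question can be written as follows the notation is our shorthand and is simplified from the paper

```latex
% demographic shift: only the marginal of the demographic attribute T moves,
% within user-supplied interval bounds; the conditional distribution is fixed
q(x, y, t) = p(x, y \mid t)\, q(t), \qquad a_t \le q(t) \le b_t \quad \text{for all } t
```

the returned model is then required with probability at least 1 - delta over the training data to keep the chosen unfairness measure below tau under every admissible q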
[ 5333, 534, 310, 247, 1805, 1929, 1307, 275, 6239, 390, 849, 1057, 352, 9184, 50276, 32897, 6266, 849, 253, 906, 275, 10012, 374, 310, 1774, 285, 908, 281, 1329, 253, 1332, 50276, 32897, 823, 253, 1921, 323, 13887, 253, 1798, 10704, 5556, 6081, 432, 990, 373, 1162, 355, 4765, 310, 2780, 2566, 247, 14122, 78, 4902, 1159, 50276, 66, 1896, 10895, 281, 3368, 310, 253, 747, 6782, 10895, 6012, 432, 441, 20740, 534, 556, 3268, 15036, 4931, 18825, 285, 643, 15036, 923, 6305, 5751, 2272, 5522, 5987, 7280, 681, 44109, 5200, 10631, 5751, 2272, 253, 789, 10262, 247, 4460, 285, 2590, 2746, 281, 253, 1895, 273, 4344, 4715, 762, 15036, 627, 403, 690, 1527, 3374, 327, 253, 1895, 9978, 285, 5661, 1543, 534, 891, 651, 751, 4477, 281, 3794, 281, 275, 2264, 253, 789, 310, 247, 22335, 2266, 7680, 281, 253, 13332, 1154, 6239, 327, 253, 1895, 50275, 4064, 253, 9841, 1160, 2544, 285, 8254, 6787, 275, 253, 2380, 619, 7350, 403, 9713, 891, 4122, 11907, 4477, 281, 2118, 253, 1543, 327, 747, 10895, 5955, 327, 2905, 789, 285, 5415, 18825, 12474, 281, 253, 2022, 2505, 5474, 339, 431, 248, 2929, 10262, 247, 966, 273, 11333, 323, 17749, 28959, 23632, 672, 253, 19007, 941, 310, 16931, 281, 247, 18825, 5333, 23776, 347, 247, 16888, 5333, 275, 18825, 12474, 824, 347, 8645, 390, 5492, 275, 5301, 281, 253, 3733, 941, 1543, 403, 2530, 323, 767, 7533, 337, 672, 253, 3242, 18825, 5333, 275, 253, 19007, 4758, 310, 1929, 285, 374, 672, 253, 18825, 5333, 275, 253, 19007, 3126, 310, 7202, 436, 310, 2218, 407, 12672, 247, 1029, 39943, 5170, 3033, 327, 253, 8996, 273, 16593, 3879, 275, 253, 19007, 3126, 21654, 2218, 970, 253, 3484, 246, 2566, 14023, 403, 1160, 342, 11329, 9903, 757, 285, 572, 4914, 293, 9903, 757, 11333, 4344, 29343, 285, 28959, 10806, 253, 16774, 1783, 310, 2684, 327, 253, 9835, 10895, 281, 3283, 305, 4904, 970, 7363, 342, 8645, 347, 253, 28959, 11104, 285, 3484, 5492, 347, 253, 18825, 11104, 253, 16888, 3268, 273, 534, 2544, 2439, 253, 3733, 285, 19007, 7533, 50276, 993, 23223, 50276, 18, 253, 2929, 310, 3839, 973, 15720, 285, 3477, 281, 956, 374, 253, 1895, 5421, 310, 7964, 4722, 281, 253, 3114, 285, 581, 273, 253, 2022, 7881, 342, 253, 19007, 273, 3210, 275, 8542, 7533, 50276, 20, 253, 1029, 39943, 5170, 14493, 327, 16593, 1255, 275, 253, 19007, 4758, 1246, 271, 4722, 2746, 281, 17749, 28959, 23632, 275, 253, 19007, 4758, 50276, 8265, 3993, 337, 253, 4477, 403, 5816, 690, 7529, 2987, 342, 17749, 28959, 23632, 342, 2087, 3268, 5333, 15567, 50276, 19, 2176, 2216, 10165, 403, 5816, 2590, 22909, 824, 347, 849, 369, 253, 30370, 2803, 273, 17272, 4425, 50276, 38092, 7350, 337, 476, 253, 13260, 670, 305, 3124, 320, 6508, 281, 7533, 835, 627, 3198, 247, 5301, 342, 1675, 281, 1529, 18825, 1387, 2581, 685, 17749, 352, 281, 320, 762, 690, 7887, 50276, 19, 352, 310, 417, 4555, 2590, 849, 253, 2746, 476, 320, 6508, 281, 14308, 824, 347, 2060, 28959, 50276, 20, 275, 1340, 281, 1805, 2939, 253, 3290, 273, 253, 4344, 13650, 275, 253, 19007, 4758, 352, 651, 320, 9371, 281, 452, 690, 16774, 1783, 672, 2709, 14308, 273, 28959, 326, 778, 417, 320, 13333, 403, 5125, 407, 253, 2608, 50276, 250, 3065, 337, 1625, 73, 288, 1625, 73, 391, 278, 7110, 1403, 796, 362, 50276, 348, 328, 4595, 391, 43425, 14172, 28959, 15927, 285, 36455, 762, 9383, 11610, 5333, 275, 10061, 273, 253, 43425, 913, 78, 8059, 327, 28959, 30990, 285, 22107, 7266, 31389, 374, 3443, 259, 50276, 44217, 1269, 43425, 10237, 28959, 13823, 4715, 762, 3410, 5438, 8492, 549, 32693, 638, 3845, 549, 32693, 19, 10655, 12730, 1967, 495, 277, 2284, 480, 50276, 33167, 256, 278, 
5203, 8492, 5203, 5333, 4344, 5145, 4715, 342, 36230, 13301, 253, 9021, 273, 253, 2929, 403, 2590, 347, 253, 2929, 310, 5816, 690, 2905, 2987, 253, 8245, 14023, 476, 320, 2007, 6508, 281, 2939, 643, 2616, 347, 5125, 50276, 7152, 339, 431, 248, 4477, 12661, 271, 2746, 1925, 439, 338, 555, 326, 3400, 1029, 39943, 28959, 23632, 672, 253, 3268, 273, 3733, 285, 19007, 310, 1027, 597, 4081, 767, 7274, 281, 18915, 253, 1895, 581, 1754, 327, 1929, 3268, 5333, 285, 1529, 581, 323, 672, 253, 3268, 5333, 310, 7202, 597, 6760, 616, 2746, 327, 247, 10895, 323, 5974, 2323, 10554, 285, 2429, 616, 2746, 342, 256, 5503, 275, 21678, 28959, 7274, 50276, 783, 2929, 39223, 271, 4722, 285, 11132, 1895, 253, 2934, 273, 23694, 253, 8892, 275, 247, 7529, 3282, 310, 4722, 347, 597, 10957, 253, 1895, 715, 253, 7431, 5438, 285, 12203, 323, 253, 1029, 39943, 5170, 3033, 253, 2022, 2523, 2299, 310, 253, 941, 3268, 629, 326, 352, 3133, 326, 597, 42126, 6266, 697, 5199, 7613, 891, 717, 417, 2119, 1880, 352, 369, 247, 4344, 3410, 5438, 275, 534, 1097, 273, 253, 20926, 285, 36196, 403, 8612, 273, 253, 2603, 941, 50276, 249, 253, 10199, 253, 4477, 7558, 326, 597, 1581, 4212, 281, 3609, 28959, 27367, 432, 247, 1781, 966, 7293, 327, 253, 2898, 5028, 2299, 2556, 281, 616, 2746, 352, 310, 417, 2590, 849, 597, 476, 2395, 323, 2060, 28959, 476, 253, 4477, 21184, 327, 436, 50276, 44295, 3062, 253, 4477, 14409, 275, 2593, 277, 273, 24864, 2144, 627, 310, 271, 9376, 326, 1016, 3602, 11508, 403, 3907, 534, 310, 2223, 3221, 476, 253, 4477, 5513, 824, 816, 6787, 50276, 14094, 1332, 273, 12488, 253, 3268, 5333, 275, 1097, 1929, 285, 7202, 15036, 15771, 327, 2608, 33850, 1491, 6296, 352, 310, 271, 9376, 281, 253, 1895, 534, 778, 2186, 440, 5672, 50276, 12756, 352, 320, 1896, 281, 2736, 3268, 5333, 387, 253, 19007, 285, 1818, 253, 1566, 15672, 2139, 627, 310, 247, 878, 281, 6194, 3210, 323, 2710, 7202, 3268, 15036, 50276, 262, 310, 4755, 326, 7296, 2710, 15036, 556, 247, 5699, 3486, 327, 253, 3045, 2556, 281, 616, 1543, 275, 2593, 5976, 50275, 783, 2605, 273, 253, 7103, 2593, 310, 417, 21414, 281, 479, 285, 253, 5301, 342, 275, 21678, 28959, 7274, 1057, 417, 1646, 4344, 253, 760, 3588, 5301, 310, 8772, 281, 253, 28959, 23632, 2299, 352, 310, 417, 2590, 281, 479, 2139, 253, 4477, 2429, 616, 2746, 762, 3268, 5333, 342, 643, 3082, 326, 513, 417, 1750, 281, 320, 4344, 672, 253, 10670, 273, 253, 6194, 285, 1071, 5239, 403, 1027, 671, 1677, 253, 1543, 3559, 323, 253, 3368, 342, 7202, 3268, 5333, 50276, 262, 3133, 253, 11329, 9903, 757, 5933, 17923, 1805, 326, 616, 2746, 2654, 1804, 253, 4477, 281, 7277, 616, 2746, 342, 3082, 4081, 323, 4473, 5333, 671, 352, 588, 320, 4217, 281, 823, 387, 38462, 581, 625, 10895, 323, 253, 7103, 281, 1056, 2119, 326, 841, 1543, 403, 5185, 275, 2710, 22349, 891, 452, 1119, 253, 2022, 2934, 273, 253, 2929, 4722, 2299, 891, 452, 690, 24626, 670, 616, 4081, 1332, 285, 253, 1039, 326, 597, 6760, 616, 2746, 285, 2429, 352, 342, 643, 3082, 923, 2045, 2593, 50276, 12563, 891, 1804, 253, 4477, 281, 2770, 760, 327, 1387, 28959, 347, 352, 310, 417, 2590, 849, 616, 2746, 476, 320, 908, 323, 2060, 28959, 3082, 50276, 7152, 339, 431, 248, 2929, 16633, 327, 5933, 280, 28959, 275, 5145, 4715, 13361, 285, 275, 1798, 327, 253, 1895, 273, 18825, 5333, 253, 16038, 3249, 432, 253, 958, 326, 253, 3268, 273, 253, 6944, 3072, 1537, 1818, 875, 3733, 285, 19007, 50276, 783, 9380, 13067, 436, 347, 247, 5333, 275, 253, 16888, 3268, 273, 247, 2014, 3632, 4778, 5492, 2825, 3966, 50276, 783, 9978, 19401, 247, 9162, 4836, 835, 7996, 1491, 253, 
28959, 11104, 1537, 417, 320, 4320, 281, 897, 533, 310, 2130, 1491, 1677, 247, 2176, 2362, 273, 1600, 285, 247, 2303, 13761, 1318, 29201, 24088, 342, 1675, 281, 2825, 253, 3221, 10247, 2281, 273, 253, 1566, 1364, 2708, 29201, 323, 5343, 271, 5933, 310, 2931, 281, 320, 4344, 604, 253, 5933, 33526, 253, 2303, 1318, 29201, 342, 1029, 5912, 1677, 407, 690, 18687, 4764, 281, 2395, 323, 18825, 5333, 253, 1566, 2007, 3797, 247, 18825, 11104, 246, 253, 3268, 273, 246, 476, 1818, 689, 673, 533, 253, 17697, 3268, 1364, 3464, 253, 1072, 407, 9376, 50276, 28821, 436, 9978, 253, 2929, 23970, 439, 338, 555, 271, 13361, 5933, 326, 3400, 1029, 39943, 23632, 326, 247, 28959, 2867, 588, 2186, 1014, 672, 247, 18825, 5333, 556, 5866, 439, 338, 555, 2987, 347, 3637, 352, 3936, 347, 3280, 247, 3733, 15302, 690, 2424, 28959, 10806, 285, 247, 5740, 273, 18825, 15036, 840, 352, 27959, 253, 10895, 281, 767, 20077, 9010, 247, 1566, 1754, 327, 253, 806, 10895, 840, 352, 4648, 2805, 285, 253, 1273, 10895, 281, 1973, 271, 5170, 7162, 3033, 323, 253, 1566, 846, 19007, 285, 1677, 18825, 5333, 352, 6548, 760, 3210, 326, 403, 16593, 846, 19007, 342, 5912, 387, 954, 18687, 50276, 783, 4679, 897, 1071, 7363, 941, 432, 253, 270, 9918, 757, 9835, 985, 285, 921, 253, 20297, 3045, 273, 253, 4081, 5933, 50276, 26122, 50275, 74, 1119, 253, 7473, 5426, 273, 18825, 5333, 3239, 495, 8489, 1892, 281, 14390, 285, 2096, 540, 41597, 50275, 251, 1755, 273, 253, 2045, 4385, 352, 310, 417, 2590, 281, 479, 2139, 253, 4477, 1056, 247, 13812, 875, 28959, 11104, 285, 18825, 11104, 752, 604, 253, 6831, 5605, 281, 1379, 715, 2395, 1097, 5492, 285, 8645, 347, 247, 28959, 17734, 6885, 11104, 671, 2139, 310, 352, 5272, 281, 5467, 326, 246, 2544, 16186, 374, 533, 253, 17697, 5912, 16186, 495, 1057, 417, 275, 1798, 275, 752, 7533, 352, 310, 17285, 281, 5467, 326, 16186, 495, 6556, 50276, 4674, 30581, 19584, 247, 625, 15958, 10076, 352, 310, 1892, 281, 1158, 273, 15216, 835, 2805, 651, 320, 4555, 1929, 594, 891, 1158, 436, 2593, 943, 320, 3559, 347, 629, 273, 253, 2022, 9978, 285, 417, 347, 271, 6880, 2299, 253, 5740, 273, 253, 1332, 310, 417, 1077, 10799, 849, 513, 253, 10704, 11193, 285, 45666, 23117, 13757, 4555, 789, 275, 436, 4758, 50276, 6259, 30762, 247, 285, 2905, 789, 327, 7223, 5910, 812, 320, 5520, 849, 1057, 439, 338, 555, 2085, 23632, 326, 253, 643, 11333, 2550, 651, 352, 2430, 1534, 11237, 273, 253, 5368, 11333, 285, 849, 604, 1896, 281, 2395, 323, 18825, 15036, 2867, 270, 50276, 249, 2426, 273, 4028, 253, 2929, 310, 3240, 4518, 3542, 627, 403, 690, 49495, 275, 2176, 5053, 24088, 2593, 374, 50276, 249, 253, 2022, 9021, 1618, 3239, 374, 891, 1158, 326, 253, 10527, 7680, 374, 310, 271, 689, 25322, 10012, 337, 310, 2581, 4755, 285, 642, 23632, 403, 2530, 323, 253, 7202, 18825, 5333, 1083, 5046, 13546, 3268, 6820, 14493, 651, 320, 1896, 50276, 22309, 369, 760, 436, 10895, 6777, 452, 253, 4477, 3597, 7826, 13506, 4679, 342, 643, 15302, 24088, 253, 1569, 26120, 10895, 908, 2223, 275, 4344, 13361, 50276, 5430, 310, 824, 247, 14138, 18825, 5333, 5544, 275, 270, 9918, 3036, 374, 50276, 783, 6197, 281, 7472, 1016, 3210, 299, 670, 7103, 275, 2593, 5976, 310, 417, 1077, 2590, 50275, 74, 1158, 253, 2929, 26847, 2220, 271, 1774, 8542, 5691, 417, 760, 275, 4344, 13361, 533, 13361, 3210, 908, 323, 3061, 11849, 275, 2087, 253, 1783, 310, 3626, 15246, 285, 22335, 3590, 253, 954, 7681, 4809, 310, 253, 13782, 273, 253, 5170, 7162, 3033, 534, 310, 2629, 594, 891, 1158, 627, 310, 642, 1798, 38135, 275, 697, 35961, 7680, 50275, 3529, 753, 891, 1158, 253, 20178, 
7680, 273, 253, 2929, 310, 1335, 3240, 747, 285, 4217, 275, 253, 1708, 273, 253, 4460, 2898, 891, 3021, 1158, 253, 5933, 310, 3240, 4460, 50276, 7594, 8299, 352, 651, 320, 9371, 604, 253, 4477, 2530, 247, 30909, 5301, 281, 253, 954, 2905, 2987, 285, 5544, 849, 616, 2746, 310, 4460, 50275, 71, 3341, 891, 1119, 253, 7103, 2593, 21414, 285, 6210, 1591, 886, 4525, 891, 651, 11435, 3081, 4679, 327, 50276, 977, 7561, 908, 275, 4344, 13361, 50276, 46906, 1507, 1014, 13506, 4679, 50276, 187, 187, 4118, 18435, 27, 783, 2929, 2175, 253, 4758, 273, 1387, 3169, 28959, 762, 253, 9267, 18859, 18825, 5333, 835, 253, 16888, 3268, 273, 253, 941, 4558, 253, 1072, 17697, 327, 253, 14632, 533, 253, 14632, 3268, 476, 1818, 352, 3400, 247, 966, 273, 11333, 534, 1918, 1029, 7162, 23632, 762, 18825, 5333, 275, 1097, 253, 1929, 285, 7202, 5333, 4758, 50276, 1189, 455, 253, 2929, 310, 247, 32811, 7680, 352, 3400, 247, 747, 6907, 281, 253, 1774, 1895, 273, 1387, 3169, 28959, 342, 1175, 10527, 285, 16774, 1543 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[token id sequence omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper introduces an allpair message passing scheme based on a kernelized gumbelsoftmax operator to reduce the complexity of message passing from quadratic in number of nodes n to linear time at the cost of some approximation error the authors provide an analysis of the incurred error and present empirical comparisons with stateoftheart methods strengths the paper is very wellwritten and organized the problem of efficient graph learning is highly relevant to the ml community sound theoretical results are provided that bound the approximation error of the proposed scheme sec 32 the presented method combines existing techniques random features reparameterization trick gumbel distribution sampling in a novel way empirical results and comparisons sec 4 on common graph datasets support the improved effectiveness of the proposed approach visualizations and sensitivity analysis of relevant variables tau are provided in the appendix weaknesses the method requires a parameter tau to be tuned in advance is there any way to learn or automatically set tau based on the theoretical analysis eg mathematically balance the tradeoff mentioned in line 178 so that ablation studies are not needed on an application specific basis yes in the appendix docsepthe authors proposed a gnn framework named aligns that can simultaneously learn the latent topology and missing labels of nodes they came up with a two step approximation to first kernelize the softmax function used in latent node structures then a continuous relaxation of the categorical distribution over all pairs of edges to enable endtoend back propagation in the experimentation section they tested on various datasets with up to 2m nodes in both transductive and inductive setting given clean noisy and even no graph information results show not only aligns has a performance gain against sota models but also it scales with increasing number of nodes pros the paper is wellwritten and easy to read method has potential for use in a much wider range without constraining on graphstructured input scalability of the model is very impressive the kernelization and continuous relaxation seem to hold up pretty well in real world task performance in transductive setting with standard benchmarks in homophily and heterophily graphs are on par or outperform sota in inductive setting even after adding noise to edges the model still performs very stably cons typo in line 331 elbo not eblo eq 8 is not the same as in algorithm 1 shouldnt it be zul on the right side i dont get why all datasets have a fixed tau 025 shouldnt it be based on the feature size and graph structure from my personal perspective i would like to see the learned latent structure in main text and have a comparison to the original graphs i didnt see a negative societal impact of this work docsepthis paper aims to overcome the deficiencies of oversquashing heterophily longrange dependencies edge incompleteness etc by proposing a new structure to learn the topology of the input graphs specifically it proposes to incorporate the learned topology into the attention matrix through the softmax gumbel operator to reduce the runtime complexity a kernelized version of attention computation is developed using random feature decomposition also convergence properties of the approximation are derived extensive experiments are conducted on different types of node classification tasks strengths 1 theoretical
analysis is comprehensive convergence results are developed to study the quality of the fast computation of the softmax gumbel operator 2 the variational perspective of the proposed method is motivating 3 numerical experiments are comprehensive as well which includes testing on small graphs large graphs transductive and inductive task and graphenhanced applications etc weaknesses 1 the paper mentions that the new method is proposed to tackle problems like oversquashing heterophily longrange dependencies etc however in the main body of the paper it is hard to realize how the development of the new structure helps to resolve these issues accordingly it would be great if the authors could point this out more clearly 2 since the method is proposed for scalable gnn the authors are encouraged to review and discuss more about sota scalable gnns and compare with the proposed model in the related works section such as subgraphwise sampling graphsaint 1 kernelized masked attention gkat 2 linearized gnn 34 etc also those models could be reasonable baselines in the section 42 experiments on larger graph datasets where currently only mlp and gcn are compared 1 zeng hanqing et al graphsaint graph sampling based inductive learning method arxiv preprint arxiv190704931 2019 2 choromanski krzysztof et al graph kernel attention transformers arxiv preprint arxiv210707999 2021 3 wu felix et al simplifying graph convolutional networks international conference on machine learning pmlr 2019 4 bojchevski aleksandar et al scaling graph neural networks with approximate pagerank proceedings of the 26th acm sigkdd international conference on knowledge discovery data mining 2020 the limitations and negative social impacts are not addressed in the paper docsepthis work performs message passing over any pairs of nodes in the graph while avoiding the n2 time complexity by using kernelized operator the key technique is to use random feature to estimate the kernel function the authors conduct experiments on various nodelevel tasks to evaluate the method pros 1 using random feature technique to approximate the kernel function is impressive and reduces the complexity significantly 2 the presentation flow of this paper is clear and easy to follow cons 1 more convincing benchmark comparison should be considered actually i was surprised by the effectiveness of the random feature approximation i think more rigorous benchmark comparison should be used to evaluate the empirical performance in the current experiments except ogbproteins other experiments are mainly using the setting defined by this paper which might be less convincing to me for example for the citation networks we have wellestablished benchmark setup in the community the setup used in this paper is different and the results of the baselines are not consistent with prior works for example on citeseer the accuracy of jknet 72 is much lower than gcn 76 which indicates that the reproduced results of baselines might not be convincing postrebuttal after reading the response from authors my original concerns are addressed not applicable ### Summary:
the paper presents an approximate way to perform allpair message passing within the context of gnns the papers main contribution is a series of extensive empirical results and at the same time theoretical justification for the approach all the reviewers liked the paper and noted the impressive scalability of the approach some technical questions and concerns were also addressed post rebuttal this paper is recommended for acceptance
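The reviews and summary above lean on two standard building blocks: a positive random-feature approximation of the softmax kernel, which turns quadratic all-pair attention into a linear-time computation, and a Gumbel-softmax relaxation for differentiable sampling over candidate edges. A minimal NumPy sketch of both ideas follows; it is a generic illustration rather than the paper's actual implementation, and the feature count, temperature value, and function names are placeholder assumptions.

```python
import numpy as np

def random_features(x, w):
    # Positive random features for the softmax kernel: in expectation over
    # w ~ N(0, I), phi(q) . phi(k) equals exp(q . k).
    proj = x @ w.T                                        # (n, m)
    sq_norm = 0.5 * np.sum(x ** 2, axis=-1, keepdims=True)
    return np.exp(proj - sq_norm) / np.sqrt(w.shape[0])

def linear_allpair_attention(q, k, v, num_features=256, seed=0):
    # All-pair aggregation in O(n * m * d) instead of the O(n^2 * d) softmax.
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((num_features, q.shape[-1]))
    q_prime, k_prime = random_features(q, w), random_features(k, w)
    kv = k_prime.T @ v                                    # (m, d_v), one pass over all nodes
    normalizer = q_prime @ k_prime.sum(axis=0)            # (n,)
    return (q_prime @ kv) / normalizer[:, None]

def gumbel_softmax(logits, tau=0.25, rng=None):
    # Continuous relaxation of categorical sampling: add Gumbel(0, 1) noise,
    # then apply a temperature-scaled softmax so gradients can flow end to end.
    if rng is None:
        rng = np.random.default_rng()
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / tau
    y = y - y.max(axis=-1, keepdims=True)                 # numerical stability
    e = np.exp(y)
    return e / e.sum(axis=-1, keepdims=True)
```

Because phi(K).T @ V is computed once and reused for every query node, the n x n attention matrix is never materialized, which is the source of the linear scaling the reviews highlight; the tau of 0.25 mirrors the fixed value the reviewer asks about, but it is only an illustrative default here.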
[token id sequence omitted]
[all-ones attention mask omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper introduces an inverse reinforcement learning approach for learning from diverse thirdperson videos the key component of the method is representing the frames of videos with graphs which capture scene semantics and object interactions while leaving out visual details such as textures they generate the graphs by running a pretrained object detector then computing geometric relationships between each pair of detected objects then they apply temporal cycle consistencybased selfsupervised learning on the graph representations which yields an embedding space from which they can generate a dense reward signal given an observation and a goal they show much improved performance with their method on a simulated sweeping task and three robotic manipulation tasks on a real robot strengths 1 graphbased method is intuitive wellexplained and shown to be very effective 2 experiments are thorough 3 writing is very clear weaknesses 1 the selection of architecture for processing the graph and building the graph could be better motivated for instance were other graph convolutional architectures considered why use a fullyconnected graph etc 2 regarding the graph representation it is unclear exactly how it differs from prior work in objectcentric representations docsepthis paper proposes a method for learning a reward function which is both crossembodiment and crossdomain this allows for the reward function to be learned from human videos for the learned reward to be used for learning in simulation and for the learned policy to be deployed in the real world again on a physical robot a graph representation is used to cross the domain gap and temporal cycle consistency loss is used to cross the embodiment gap experiments show improved performance on a crossembodiment benchmark and successful deployment on a physical robot strengths 1 the combination of graph representations and irl appears to be novel and is likely to have significant impact as this general approach can be used for many tasks 1 the experimental finding that learned reward functions can be better than handdesigned rewards for these tasks is intriguing and may be useful for the community 1 the paper does well at proposing a vision for this learning pipeline learning rewards from realworld human demonstrations learning a policy in simulation using learned rewards and finally deployment of the learned policy on physical hardware this seems like a promising way forward for robot learning and therefore raises the impact potential of this work 1 evaluation on a real robot shows that this method can be readily applied in the real world supporting the vision for the full pipeline 1 data efficiency as suggested by an ablation the graph abstraction is much more dataefficient a comparison using this metric to other methods may be interesting limitations 1 the node features only use the bounding box and centroids of each object in the current draft of the paper other visual information is discarded this is what helps the method cross the domain gap but also drastically cuts the number of tasks that the method can be applied for if it does not use visual features beyond the sizes of objects for example if the object to be pushed and the goal are the same size then it is unclear how the network would be able to tell them apart 1 xmagical is a useful benchmark as it is designed specifically for evaluating crossembodiment methods
but it is not very clear why strong performance in this benchmark would necessarily translate well to strong performance in real robot experiments the policy is statebased rather than imagebased it seems like objects do not collide or interact much so this might not show the full power of the interaction networks used in this method the results on this benchmark suggest that when trained on not diverse videos but one visual style of videos the proposed method generalises better to videos with different visual features this makes sense because these visual features are all discarded by the representation keeping only bounding boxes however in the second row of figure 4 it seems like the baseline method xirl performs just as well as the proposed method when the training dataset contains visually diverse videos presumably because the visual encoder learns which features to focus on this might lessen the impact of the current work because it is also motivated by the context of learning from diverse videos 1 the realworld experiments address a scenario which also might not show the full potential of interaction networks as there are only 23 objects involved the results are not very convincing for the proposed method unfortunately in the current draft the number of demonstrations is in the range of 162256 which is high considering the simplicity of the task for the peginhole experiment it does not seem as though 3d geometric understanding is needed to complete the task in the video on the website simply moving in a 2d line without raising peg allows peg to be dragged over wall and into box due to compliance of string in the demo videos the peg is lifted over the wall it seems like the robot does not succeed in learning this behaviour additionally it is not clear how the limited visual features in the graph representation would allow this 3d information to be captured on line 250 it is claimed that the proposed method outperforms the oracle method this conclusion is not fully proven by the data as it stands this is because a of the variance of the reward curves and b because this conclusion seems highly dependent on this specific termination point for the reward curve when before that sac was significantly outperforming graphirl and after this point it is unclear which method will perform better since neither curve looks like it has reached convergence 1 the limitations section does not explore the limitations of the method in detail such as those mentioned in the review questions in eq 4 why is g not just the final frame but an aggregation across frames docsepthis work proposes a framework for robot learning from thirdperson human demonstration videos to learn from a diverse set of videos that have different viewpoints motions and embodiments the irl is performed at an abstract graph level they show that the reward function indicating task progression that is learned with this abstraction can be used to train an rl agent and the learned policy can be successfully transferred to a real robot plus the research problem is well motivated the components in the proposed pipeline are designed well there are no major flaws in experiments and evaluations choosing the right representation for an irl problem is a fundamental research question scene graphs have a lot of potential in this direction and this work has demonstrated its effectiveness no environment nor additional handcrafted reward is required which sets this paper apart from some previous works minus this work assumes an object detector is 
given which is a reasonable assumption but there are several unanswered questions what offtheshelf detector is being used here whats the detection accuracy in the tasks presented more critically how accurate the perception module needs to be for the entire pipeline to be effective graph abstraction has all the advantages mentioned in the paper but like any abstraction some information is lost what is some critical information that is lost in the abstraction does this prevent the proposed approach to generalize to tasks that heavily rely on for example object pose information 3d geometric information etc this should be at least discussed in the limitations section docsepthe authors present a technique they call graphirl for learning reward abstractions across domains with diverse embodiment gaps the method decomposes the underlying scene structure and agentobject interaction into a graph which the authors claim is invariant to distractor features like texture agent embodiment etc the learned reward is then used in an irl setting to learn policies from thirdperson demonstrations and they show successful crossembodiment transfer on both simulated and realworld setups strengths the paper is highly polished and easy to read the videos are high quality and the experimental results are quite thorough with some details missing see below real robot experiments are appreciated this is an incredibly challenging problem domain so any improvement in performance could be impactful for the community weaknesses some references are missing there have been quite a few papers that use objectcentric representations to define by definition invariant state representations one example would be manuelli keypoints into the future selfsupervised correspondence in modelbased reinforcement learning i suggest the authors perform an additional literature search the reward is not conditioned on policy actions and so implies some limitations in the form of reward that can be learned eg learned rewards cannot be directly action dependent or assumes the learned reward can infer the policy action if this is what is needed concrete example rewards that introduce penalties or costs for certain actions please add details discussing the fundamental limitations of the assertions made especially since you have a limitations section similarly the reward is defined on a representative goal frame which also limits the rewards that can be learned effectively you can only represent rewards where some individual goal state is sufficient to solve the task this should be discussed to avoid overselling the method ive read the our approach section a few times and while i think i understand how the graph is constructed a visual aid would be immensely helpful i understand the component parts psi fo fs fin and i can follow these into si and sj but it would be nice to see the flow from image to bounding box to embeddings to construction of the graph figure 2 is a good start but doesnt actually contain any of the terms from the our approach section so is somewhat hard to follow table 1 please explain why the environment reward underperforms your method is this a sparse reward and if so why did you compare a dense reward with a sparse one otherwise id expect a well crafted true reward to be an upper bound so this requires some explanation for the real robot experiments did you train on any real data or did you train the reward only in sim and then deploy in real likewise did you train the policy in real or reward policy trained in sim and 
deployed in real sorry if this is spelled out but i wasnt able to confirm if its the case that the reward was trained in sim only it would make sense to me that a method that first crops objects learns a reward function that can better transfer sim to real while a method like xirl that uses full image embeddings wont perform as well since background distractor features are going to result in generalization issues this isnt necessarily a ding against the method at all and in fact using bounding boxes is a reasonable contribution but an experiment that also trains the full stack in real would be necessary to make sure the graph representation accounts for the claimed performance gains and its not just because of cropping out of the background nit it would be helpful if the number of videos was in the table 3 caption ### Summary:
this paper proposes an interesting approach to learning rewards from a diverse set of videos and has received some strong reviews the approach is original and the paper wellmotivated and wellwritten the experimental section is thorough with good results and includes experiments on a real robot using a graphbased method for irl is novel and could have significant impact however there were a few weaknesses found by the reviewers in particular more justification for the choice of architecture and visual features could improve the paper how would different choices impact performance along with this a more indepth discussion of the limitations of the approach would greatly improve the paper the authors are encouraged to address these concerns along with other concerns raised by the reviewers thank you to the authors for the detailed response and updated experiments
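The graph abstraction these reviews describe keeps only geometric information from an off-the-shelf object detector (bounding boxes and centroids) and discards appearance such as texture. A minimal sketch of how such a graph could be built from detected boxes is given below; the exact node and edge features used in the paper may differ, so the choices here are illustrative assumptions.

```python
import numpy as np

def boxes_to_graph(boxes):
    # boxes: (n, 4) array of [x_min, y_min, x_max, y_max] from an object detector.
    centers = np.stack([(boxes[:, 0] + boxes[:, 2]) / 2,
                        (boxes[:, 1] + boxes[:, 3]) / 2], axis=-1)   # (n, 2)
    sizes = np.stack([boxes[:, 2] - boxes[:, 0],
                      boxes[:, 3] - boxes[:, 1]], axis=-1)           # (n, 2)
    nodes = np.concatenate([centers, sizes], axis=-1)                # (n, 4) node features
    # Fully connected edges carrying pairwise geometric relations only.
    offsets = centers[:, None, :] - centers[None, :, :]              # (n, n, 2)
    dists = np.linalg.norm(offsets, axis=-1, keepdims=True)          # (n, n, 1)
    edges = np.concatenate([offsets, dists], axis=-1)                # (n, n, 3) edge features
    return nodes, edges
```

Because only box geometry enters the graph, the same representation can be computed for a human demonstration video and for a simulated robot scene, which is what the reviews mean by crossing the domain and embodiment gaps; it also explains the limitation raised above, since two objects of identical size become indistinguishable in this representation.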
[token id sequence omitted]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 5520, 3045, 342, 616, 1332, 327, 247, 15524, 28110, 4836, 285, 1264, 35121, 19763, 8892, 327, 247, 1524, 15688, 20544, 337, 4216, 3169, 1332, 310, 27350, 6210, 1591, 446, 1243, 285, 2011, 281, 320, 1077, 3576, 374, 4679, 403, 11080, 495, 4028, 310, 1077, 2590, 50276, 20881, 1255, 265, 337, 253, 5438, 273, 10336, 323, 5162, 253, 4216, 285, 3652, 253, 4216, 812, 320, 1805, 17194, 323, 4227, 497, 643, 4216, 27311, 267, 35615, 2783, 2139, 897, 247, 4751, 14063, 4216, 3966, 374, 5001, 253, 4216, 6779, 352, 310, 12744, 4555, 849, 4555, 352, 19986, 432, 2720, 789, 275, 1789, 37382, 14237, 5474, 33032, 2520, 2929, 29328, 247, 1332, 323, 4715, 247, 10921, 1159, 534, 310, 1097, 23987, 4393, 351, 2092, 285, 2831, 13517, 436, 4483, 323, 253, 10921, 1159, 281, 320, 6311, 432, 1966, 10556, 323, 253, 6311, 10921, 281, 320, 908, 323, 4715, 275, 9864, 285, 323, 253, 6311, 3646, 281, 320, 18329, 275, 253, 1524, 1533, 969, 327, 247, 3520, 15688, 247, 4216, 6779, 310, 908, 281, 2831, 253, 5028, 8037, 285, 11935, 5880, 15274, 2957, 310, 908, 281, 2831, 253, 14704, 8037, 4679, 921, 5520, 3045, 327, 247, 23987, 4393, 351, 2092, 22791, 285, 5547, 19007, 327, 247, 3520, 15688, 20544, 337, 253, 5019, 273, 4216, 14237, 285, 209, 2587, 4620, 281, 320, 4460, 285, 310, 2779, 281, 452, 1534, 3486, 347, 436, 2087, 2746, 476, 320, 908, 323, 1142, 8892, 337, 253, 5661, 4560, 326, 6311, 10921, 3470, 476, 320, 1805, 685, 1133, 38061, 23267, 323, 841, 8892, 310, 27807, 285, 778, 320, 4217, 323, 253, 3114, 337, 253, 2929, 1057, 973, 387, 36636, 247, 8113, 323, 436, 4715, 15722, 4715, 23267, 432, 1524, 10186, 1966, 32367, 4715, 247, 3646, 275, 9864, 970, 6311, 23267, 285, 4720, 19007, 273, 253, 6311, 3646, 327, 3520, 10309, 436, 3133, 751, 247, 12532, 1039, 3579, 323, 15688, 4715, 285, 3103, 16540, 253, 3486, 2442, 273, 436, 789, 337, 7103, 327, 247, 1524, 15688, 2722, 326, 436, 1332, 476, 320, 12450, 3732, 275, 253, 1524, 1533, 8109, 253, 8113, 323, 253, 2120, 15722, 337, 941, 6733, 347, 5125, 407, 271, 28913, 253, 4216, 38562, 310, 1199, 625, 941, 20246, 247, 5301, 970, 436, 7982, 281, 643, 3082, 778, 320, 4722, 50276, 17465, 569, 337, 253, 4666, 3386, 760, 897, 253, 41113, 3817, 285, 1399, 287, 2352, 273, 1016, 1789, 275, 253, 1655, 7482, 273, 253, 2929, 643, 5304, 1491, 310, 25665, 436, 310, 752, 7729, 253, 1332, 2831, 253, 5028, 8037, 533, 671, 31063, 12176, 253, 1180, 273, 8892, 326, 253, 1332, 476, 320, 3732, 323, 604, 352, 1057, 417, 897, 5304, 3386, 4457, 253, 9552, 273, 5113, 323, 1650, 604, 253, 1789, 281, 320, 10184, 285, 253, 4736, 403, 253, 1072, 1979, 840, 352, 310, 12744, 849, 253, 2990, 651, 320, 2104, 281, 2028, 731, 7419, 337, 1269, 20752, 474, 310, 247, 4217, 22791, 347, 352, 310, 4158, 5742, 323, 16344, 23987, 4393, 351, 2092, 3082, 533, 352, 310, 417, 1077, 2590, 2139, 2266, 3045, 275, 436, 22791, 651, 7933, 16497, 973, 281, 2266, 3045, 275, 1524, 15688, 4679, 253, 3646, 310, 1375, 3169, 2581, 685, 2460, 3169, 352, 3133, 751, 5113, 513, 417, 3007, 504, 390, 8008, 1199, 594, 436, 1537, 417, 921, 253, 2120, 1612, 273, 253, 5016, 6928, 908, 275, 436, 1332, 253, 1543, 327, 436, 22791, 1804, 326, 672, 10166, 327, 417, 11117, 10556, 533, 581, 5304, 3740, 273, 10556, 253, 4081, 1332, 2087, 3013, 1805, 281, 10556, 342, 1027, 5304, 3386, 436, 2789, 3282, 984, 841, 5304, 3386, 403, 512, 25665, 407, 253, 6779, 7562, 760, 41113, 12783, 2299, 275, 253, 1273, 4194, 273, 4677, 577, 352, 3133, 751, 253, 8245, 1332, 1269, 2587, 17923, 816, 347, 973, 347, 253, 4081, 1332, 672, 253, 3733, 10895, 4428, 25910, 11117, 10556, 
18289, 984, 253, 5304, 32049, 33772, 534, 3386, 281, 2770, 327, 436, 1537, 1679, 257, 253, 3486, 273, 253, 1655, 789, 984, 352, 310, 671, 17194, 407, 253, 3634, 273, 4715, 432, 11117, 10556, 337, 253, 1524, 10186, 4679, 2953, 247, 10076, 534, 671, 1537, 417, 921, 253, 2120, 2442, 273, 5016, 6928, 347, 627, 403, 760, 3495, 5113, 3206, 253, 1543, 403, 417, 1077, 21414, 323, 253, 4081, 1332, 19235, 275, 253, 1655, 7482, 253, 1180, 273, 32367, 310, 275, 253, 2491, 273, 23094, 9726, 534, 310, 1029, 7296, 253, 17647, 273, 253, 4836, 323, 253, 759, 6058, 13928, 3368, 352, 1057, 417, 1646, 347, 2167, 495, 69, 17856, 4685, 310, 3058, 281, 3426, 253, 4836, 275, 253, 3492, 327, 253, 4422, 3365, 4886, 275, 247, 374, 69, 1386, 1293, 12976, 47997, 4483, 47997, 281, 320, 22944, 689, 3402, 285, 715, 3817, 1955, 281, 10276, 273, 2876, 275, 253, 22020, 10556, 253, 47997, 310, 14287, 689, 253, 3402, 50276, 262, 3133, 751, 253, 15688, 1057, 417, 9302, 275, 4715, 436, 8770, 23000, 352, 310, 417, 2590, 849, 253, 3710, 5304, 3386, 275, 253, 4216, 6779, 651, 1581, 436, 495, 69, 1491, 281, 320, 10848, 327, 1386, 10257, 352, 310, 7558, 326, 253, 4081, 1332, 41731, 13015, 253, 42295, 1332, 436, 6452, 310, 417, 4751, 11464, 407, 253, 941, 347, 352, 9572, 436, 310, 984, 247, 273, 253, 11041, 273, 253, 10921, 9191, 285, 270, 984, 436, 6452, 3133, 4122, 7976, 327, 436, 2173, 15056, 1127, 323, 253, 10921, 6970, 672, 1078, 326, 7044, 369, 3012, 41731, 14692, 4216, 2587, 285, 846, 436, 1127, 352, 310, 12744, 534, 1332, 588, 1347, 1805, 1580, 6747, 6970, 4453, 751, 352, 556, 4925, 14940, 337, 253, 7364, 2593, 1057, 417, 8338, 253, 7364, 273, 253, 1332, 275, 2508, 824, 347, 1110, 5393, 275, 253, 2278, 50276, 34974, 275, 16186, 577, 2139, 310, 305, 417, 816, 253, 2457, 3665, 533, 271, 20828, 2439, 13009, 50276, 7152, 33032, 2520, 789, 29328, 247, 7792, 323, 15688, 4715, 432, 2626, 10816, 1966, 20028, 10556, 281, 3037, 432, 247, 11117, 873, 273, 10556, 326, 452, 1027, 1859, 10801, 14462, 285, 24898, 253, 209, 2587, 310, 2684, 387, 271, 12002, 4216, 1268, 597, 921, 326, 253, 10921, 1159, 7809, 4836, 10005, 326, 310, 6311, 342, 436, 38562, 476, 320, 908, 281, 6194, 271, 391, 77, 5570, 285, 253, 6311, 3646, 476, 320, 8379, 9495, 281, 247, 1524, 15688, 50276, 11095, 50276, 783, 2561, 1895, 310, 973, 17194, 253, 4295, 275, 253, 4081, 15722, 403, 4158, 973, 627, 403, 642, 2201, 32138, 275, 4679, 285, 27163, 50276, 4039, 5555, 253, 987, 6779, 323, 271, 209, 2587, 1895, 310, 247, 7936, 2561, 1953, 6200, 14580, 452, 247, 2257, 273, 2442, 275, 436, 3884, 285, 436, 789, 556, 5183, 697, 12510, 50276, 2369, 3126, 4543, 3081, 1133, 12517, 264, 10921, 310, 2424, 534, 5239, 436, 2929, 7419, 432, 690, 2045, 2987, 50276, 10420, 50276, 2520, 789, 19584, 271, 1789, 13562, 310, 1677, 534, 310, 247, 5272, 9376, 533, 627, 403, 2067, 440, 42195, 3533, 752, 273, 649, 1041, 48164, 13562, 310, 1146, 908, 1060, 47515, 253, 5481, 7200, 275, 253, 8892, 3559, 625, 21038, 849, 7899, 253, 13071, 6333, 3198, 281, 320, 323, 253, 2862, 15722, 281, 320, 3576, 50276, 10580, 38562, 556, 512, 253, 11361, 5393, 275, 253, 2929, 533, 751, 667, 38562, 690, 1491, 310, 3663, 752, 310, 690, 4619, 1491, 326, 310, 3663, 275, 253, 38562, 1057, 436, 3657, 253, 4081, 2746, 281, 39970, 281, 8892, 326, 11306, 10725, 327, 323, 1650, 1789, 16753, 1491, 495, 69, 17856, 1491, 3966, 436, 943, 320, 387, 1878, 5469, 275, 253, 7364, 2593, 50276, 7152, 339, 431, 248, 4477, 1246, 247, 5853, 597, 1067, 4216, 2587, 323, 4715, 10921, 490, 10981, 960, 2439, 10625, 342, 11117, 14704, 18388, 
253, 1332, 11101, 6013, 253, 6944, 6200, 2605, 285, 5570, 6082, 5016, 715, 247, 4216, 534, 253, 4477, 1750, 310, 13727, 281, 940, 30524, 3386, 751, 14542, 5570, 14704, 3966, 253, 6311, 10921, 310, 840, 908, 275, 271, 209, 2587, 4758, 281, 3037, 7823, 432, 2626, 10816, 32367, 285, 597, 921, 5547, 23987, 4393, 351, 2092, 3700, 327, 1097, 15524, 285, 1524, 10186, 873, 8777, 20544, 50275, 783, 2929, 310, 4122, 29422, 285, 3477, 281, 1239, 253, 10556, 403, 1029, 3290, 285, 253, 5661, 1543, 403, 3240, 11080, 342, 690, 4278, 5816, 923, 2708, 50275, 6549, 15688, 4679, 403, 14109, 50275, 2520, 310, 271, 16088, 11132, 1895, 5028, 594, 667, 7756, 275, 3045, 812, 320, 3486, 1020, 323, 253, 3114, 50276, 20881, 1255, 265, 50275, 8826, 10414, 403, 5816, 627, 452, 644, 3240, 247, 1643, 9380, 326, 897, 1789, 37382, 14237, 281, 4853, 407, 5426, 13727, 1375, 14237, 581, 1650, 651, 320, 637, 86, 13890, 2234, 10801, 715, 253, 2852, 1881, 35421, 17668, 275, 1566, 3169, 35221, 4715, 891, 1804, 253, 4477, 1347, 271, 3081, 6239, 3186, 50275, 783, 10921, 310, 417, 27039, 327, 3646, 5231, 285, 594, 8018, 690, 7364, 275, 253, 830, 273, 10921, 326, 476, 320, 6311, 24088, 6311, 23267, 2550, 320, 3587, 2250, 7976, 50276, 263, 19584, 253, 6311, 10921, 476, 9441, 253, 3646, 2250, 604, 436, 310, 752, 310, 3058, 11859, 1650, 23267, 326, 9569, 22414, 390, 4815, 323, 2176, 5231, 4496, 823, 4278, 16585, 253, 7936, 7364, 273, 253, 33184, 1160, 3340, 1580, 368, 452, 247, 7364, 2593, 50275, 3549, 6241, 253, 10921, 310, 2931, 327, 247, 8612, 4736, 3665, 534, 671, 7787, 253, 23267, 326, 476, 320, 6311, 8069, 368, 476, 760, 1957, 23267, 835, 690, 2060, 4736, 1375, 310, 4209, 281, 8415, 253, 4836, 436, 943, 320, 5469, 281, 3693, 689, 23708, 253, 1332, 50275, 422, 1239, 253, 776, 2746, 2593, 247, 1643, 2069, 285, 1223, 891, 1158, 891, 2096, 849, 253, 4216, 310, 8818, 247, 5304, 8596, 651, 320, 45224, 9371, 891, 2096, 253, 4445, 4243, 3714, 74, 7954, 25290, 1442, 285, 891, 476, 956, 841, 715, 4927, 285, 41996, 533, 352, 651, 320, 5322, 281, 923, 253, 2685, 432, 2460, 281, 41113, 3817, 281, 46234, 281, 5140, 273, 253, 4216, 4677, 374, 310, 247, 1175, 1265, 533, 36908, 2686, 3831, 667, 273, 253, 2426, 432, 253, 776, 2746, 2593, 594, 310, 8489, 1892, 281, 956, 50275, 2420, 337, 4496, 5513, 2139, 253, 3126, 10921, 762, 468, 13015, 634, 1332, 310, 436, 247, 23507, 10921, 285, 604, 594, 2139, 858, 368, 7277, 247, 14086, 10921, 342, 247, 23507, 581, 5010, 2654, 1902, 247, 973, 37171, 2032, 10921, 281, 320, 271, 5170, 3033, 594, 436, 4419, 690, 8813, 50275, 1542, 253, 1524, 15688, 4679, 858, 368, 6194, 327, 667, 1524, 941, 390, 858, 368, 6194, 253, 10921, 760, 275, 948, 285, 840, 8745, 275, 1524, 21223, 858, 368, 6194, 253, 3646, 275, 1524, 390, 10921, 50276, 22872, 10166, 275, 948, 285, 18329, 275, 1524, 7016, 604, 436, 310, 43997, 562, 533, 891, 369, 2649, 2104, 281, 6583, 604, 697, 253, 1083, 326, 253, 10921, 369, 10166, 275, 948, 760, 352, 651, 1056, 3282, 281, 479, 326, 247, 1332, 326, 806, 19492, 5113, 33772, 247, 10921, 1159, 326, 476, 1805, 3700, 948, 281, 1524, 1223, 247, 1332, 751, 1269, 2587, 326, 4648, 2120, 2460, 46234, 31451, 1347, 347, 973, 1580, 4114, 940, 30524, 3386, 403, 1469, 281, 906, 275, 26647, 3374, 436, 310, 2649, 7933, 247, 50097, 1411, 253, 1332, 387, 512, 285, 275, 958, 970, 41113, 12783, 310, 247, 5272, 7680, 533, 271, 3368, 326, 671, 18784, 253, 2120, 8031, 275, 1524, 651, 320, 3309, 281, 1056, 2119, 253, 4216, 6779, 8553, 323, 253, 7558, 3045, 15988, 285, 697, 417, 816, 984, 273, 9187, 2784, 562, 273, 253, 4114, 
50275, 32202, 352, 651, 320, 9371, 604, 253, 1180, 273, 10556, 369, 275, 253, 2829, 495, 11743, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 271, 4722, 2746, 281, 4715, 23267, 432, 247, 11117, 873, 273, 10556, 285, 556, 2959, 690, 2266, 10123, 253, 2746, 310, 3236, 285, 253, 2929, 973, 24013, 8550, 285, 973, 15720, 253, 5661, 2593, 310, 11080, 342, 1175, 1543, 285, 3797, 4679, 327, 247, 1524, 15688, 970, 247, 4216, 3169, 1332, 323, 209, 2587, 310, 4460, 285, 812, 452, 1534, 3486, 2299, 627, 497, 247, 1643, 32213, 1119, 407, 253, 30628, 275, 1798, 625, 22861, 323, 253, 4327, 273, 10336, 285, 5304, 3386, 812, 3157, 253, 2929, 849, 651, 1027, 10165, 3486, 3045, 2112, 342, 436, 247, 625, 801, 554, 394, 5955, 273, 253, 7364, 273, 253, 2746, 651, 10260, 3157, 253, 2929, 253, 4477, 403, 14659, 281, 2953, 841, 7350, 2112, 342, 643, 7350, 5439, 407, 253, 30628, 50274, 47033, 368, 281, 253, 4477, 323, 253, 7000, 2380, 285, 9300, 4679, 209 ]
Below is a review of a research paper from a conference/journal. Please write a summary of the review. ### Review:

the paper proposes an accelerated mcmc method for sampling, motivated by nesterov's accelerated gradient (nag) method. starting from the high-resolution ode of nag obtained in shi et al., the paper applies a two-stage mechanism to remove the hessian dependency; the obtained first-order ode system serves as the backbone of the proposed diffusion process, the hessian-free high-resolution (hfhr) dynamics. discretization of the hfhr dynamics leads to the proposed sampling method. theoretical convergence guarantees are provided for both the continuous and the discretized variant, showing acceleration over the existing underdamped langevin method (uld). the paper is very technical and not easy to follow. the main contribution of the paper is to provide an accelerated first-order diffusion process in continuous time, yielding a novel sampling method after discretization. i have a few questions/concerns to be clarified regarding the claims in the paper:
1. acceleration in continuous time: compared to the rate obtained in dalalyan (2020), the proposed algorithm improves by a factor of $\kappa$. however, i am confused, as no improvement is achieved for the function $f(x,y) = m x^2 + L y^2$; this seems contradictory to the claim. do i misunderstand anything? is it related to the time scaling?
2. discretization: when discretizing, the paper applies a second-order symmetric composition. what is the motivation behind this step, and what would be the benefit compared to applying a forward euler discretization?
3. acceleration in discrete time: unless it is shown that the constant term is improved by a nontrivial factor ($L$ or $\kappa$, for example), i wouldn't call it an acceleration, as those are only upper bounds in the analysis.
4. uld = hfhr(0, $\gamma$): i understand that the continuous odes are equivalent by taking $\alpha = 0$. does the discretized algorithm still follow the same equivalence? for instance, a different discretization method may be applied.
5. comments on experiments: (a) in figure 2 it seems surprising to me that only 2 iterations are enough to reach $\epsilon$-closeness, which may suggest the problem is too easy. (b) in figure 3, why do some figures only have $\alpha = 0.1$ while the others have $\alpha \in \{0.1, 0.5, 1\}$?
overall, i believe the theoretical result of the paper has merit, but the presentation needs to be improved.

docsep

this paper introduces a hessian-free high-resolution sde inspired by accelerated gradient (nag). the author shows that the continuous solution achieves an acceleration over the underdamped langevin; a discrete algorithm also has a speedup with a constant factor. this paper has solid theoretical analysis; however, the author missed an important comparison with the randomized midpoint method (shen & lee, 2019). i think the author proved an iteration complexity that is worse than the method in that paper, which invalidates the authors' claim of a constant-factor speedup over underdamped langevin dynamics.
specific comments:
1. in assumption a1, $m\|y-x\| \le \|\nabla f(y) - \nabla f(x)\| \le L\|y-x\|$ seems insufficient to establish $m$-strong convexity and $L$-smoothness.
2. does this paper consider the log-concave case? in the introduction the author said the paper will consider the setup of log-concave / log-strongly-concave target distributions; however, it seems assumption a1 requires $m > 0$.
3. on page 9 the author wrote "(strongly) log-concave assumptions required in theorem 5.2" — why is there a bracket? does it imply that theorem 5.2 could be applied in the log-concave case?
4. in remark 5.5, at the last line, the author claimed the steps needed by uld can be halved by hfhr. which algorithm is the author referring to for uld: the algorithm in cheng et al. 2018, or hfhr with $\alpha = 0$? it seems these two algorithms are actually different.
5. in section 6.1 the author proposed to use the error of the mean as a surrogate for the 2-wasserstein distance. although the error of the mean is a lower bound, how tight is it? in figure 2 the mean value at $\alpha = 0.5$ seems close to 1, which means hfhr achieves $\varepsilon \le 0.1$ accuracy in just one iteration; clearly one iteration is not enough for burn-in. does this mean the error of the mean is a bad indicator of mixing?
6. in figure 1, could the author also add lines for uld and the randomized midpoint method?
i think this paper could be improved substantially, but currently i would like to recommend reject.

docsep

this study focused on developing a hessian-free high-resolution nesterov acceleration approach, which provably converges faster and is nontrivial (not a time-rescaling trick). this paper introduces a hessian-free high-resolution nesterov acceleration approach which proves faster convergence than both underdamped and overdamped langevin dynamics; in addition, it also proves that the acceleration cannot be achieved by time-rescaling. experimental results support the argument on optimization acceleration as well as the theoretical results. some questions the reviewer has in mind: theorem 5.1 depends on the assumption on $\gamma$ and $\alpha$ — does such an assumption often hold in real applications? it would be better to demonstrate the acceleration's superiority with a more complex network and a large-scale dataset. the reviewer thinks this paper contributes a new and effective gradient-based optimization solver; overall, the reviewer would like to recommend acceptance.

docsep

the author proposed an accelerated-gradient-based mcmc method based on nesterov's accelerated gradient (nag). in continuous time the algorithm is able to achieve a tremendous acceleration over the underdamped langevin algorithm; numerical schemes can provide a speedup with a constant factor. this paper builds a nag-based mcmc sampler, named hessian-free high-resolution hfhr($\alpha$, $\gamma$), which generalizes the underdamped langevin (recovered as hfhr(0, $\gamma$)) with an additional gradient drift and brownian motion for the update of the position variable.
pros: the most prominent feature of the extended algorithm is that it yields much faster exponential convergence, with rate $O(L)$ in continuous time, than the standard underdamped langevin algorithm (cheng18: rate of $m/L$; dalalyan20: rate of $m/\sqrt{L}$). although the acceleration becomes less significant in numerical algorithms due to an increase of discretization error, some speedup by a constant factor can still be achieved.
cons: the writing is not very clear, which affects readability and made it hard for me to check the details. for example: 1. the derivation of formula 6 is slightly ad hoc, and more related work or motivation is suggested. 2. at the end of section b, a link to the definitions of the notations may be suggested. 3. on page 24, $h_t$ is derived based on a taylor expansion — saying so is better for presenting to the readers than saying nothing.
remark: theorem 5.1 is checked carefully; theorem 5.2 is not, but the logic seems to be reasonable.
minor issues: 1. the experiment conducted in section 6.2 is nonconvex and doesn't match the theory. 2. missing part in related works — replica exchange (aka parallel tempering): (1) accelerating nonconvex learning via replica exchange langevin diffusion, iclr'19; (2) nonconvex learning via replica exchange stochastic gradient mcmc, icml'20.
cheng18: underdamped langevin mcmc: a non-asymptotic analysis, colt'18.
dalalyan20: a. s. dalalyan and l. riou-durand, on sampling from a log-concave density using kinetic langevin diffusions, bernoulli.
the nag-based mcmc sampler extends the underdamped langevin with a drift term and brownian motion and is proved faster than the alternative; i would recommend this paper to be accepted.

docsep

this paper discusses the accelerated-gradient-based mcmc method and proposes some rigorous proofs. i do not think the problem addressed by this paper makes sense: the gradient-based algorithm with added noise does not, in the convex case, play an essentially different role from the one without the noise. the authors are encouraged to refer to: bin shi, weijie su and michael i. jordan, on learning rates and schrödinger operators, https://arxiv.org/abs/2004.06977; bin shi, on the hyperparameters in stochastic gradient descent with momentum, https://arxiv.org/abs/2108.03947.

docsep

this paper introduces a stochastic process called hfhr. the sdes of hfhr are derived by rewriting nesterov's accelerated gradient (nag) into a phase-space representation, formulating it as odes, and then injecting noise into both the position and momentum variables. hfhr can be used for sampling because it has the target distribution as its stationary distribution. a discretization of hfhr is given by operator splitting and euler integration. a mixing-time bound of $\widetilde{O}(\sqrt{d}/\varepsilon)$ is obtained for sampling a log-strongly-concave and smooth target distribution under an extra third-order growth condition.
strengths: the hfhr process seems novel and has interesting connections with nag. the paper is mostly clear and easy to follow.
weaknesses:
1. hfhr has noise injected into the position directly. this property is actually harmful for discretization, because directly coupling brownian motion into $q_t$ makes the trajectory of $q_t$ extremely non-smooth and makes it hard to estimate $\nabla f(q_t)$. uld avoids directly injecting noise into the position in order to achieve acceleration over the overdamped ld; hfhr, unlike uld, has a noise term $dW_t$ in the position, yet it still achieves the same asymptotic convergence as certain discretized uld, at the price of an extra condition on a higher-order derivative.
2. it relies on an additional third-order growth condition. although this condition is weaker than hessian lipschitzness and has been used in prior art to accelerate the convergence of ld (li et al. 2021), it still makes the convergence guarantee weaker than many discretized uld methods, which have the same or better asymptotic convergence speed and don't require a third-order growth condition.
3. the condition-number ($\kappa$) dependence and the third-order growth constant ($g$) dependence of the iteration complexity are unclear. the main paper doesn't specify the asymptotic dependence on $\kappa$ and $g$ but hides it in the constant $c$; including that information in the main paper would be appreciated. according to the appendix (page 20, where $c$ is defined), i guess that the $\kappa$ dependence is $O(\kappa^{3/2})$ and the $g$ dependence is $O(g)$ — could the author help verify these statements?
4. the iteration complexity is worse than a certain discretization of uld. although the authors cited shen & lee 2019, they didn't discuss the methods in that paper at all; the randomized midpoint method is a special discretization of uld which achieves $\widetilde{O}(d^{1/3}/\varepsilon^{2/3})$ iteration complexity and doesn't require a third-order growth condition.
5. the method is not well motivated. according to the paper, hfhr is motivated by one question: how to appropriately inject noise into the nag algorithm in discrete time. we should note that, although there are underlying connections between optimization and sampling, injecting noise into a good optimization method doesn't necessarily yield a good sampling algorithm. unfortunately, this paper seems to be such a case: more specifically, discrete hfhr with $\alpha \neq 0$ has worse iteration complexity compared to a certain discretization of uld.
6. the points in figure 2 are biased estimates. if i understand correctly, the y-axis value in figure 2 is the smallest number $k$ such that $\|\mathbb{E}_{\mu_k} q - \mathbb{E}_{\mu} q\| \le \varepsilon$. in order to estimate $\mathbb{E}_{\mu_k} q$, the sample mean $\frac{1}{n}\sum_{i=1}^{n} q_k^i$ is calculated for $n = 1000$ samples (according to line 3 in generate_config.py), which are generated and run in parallel for $k$ iterations. $\frac{1}{n}\sum_{i=1}^{n} q_k^i$ is indeed an unbiased estimate of $\mathbb{E}_{\mu_k} q$; however, $\min k$ s.t. $\|\frac{1}{n}\sum_{i=1}^{n} q_k^i - \mathbb{E}_{\mu} q\| \le \varepsilon$ is a biased estimate of $\min k$ s.t. $\|\mathbb{E}_{\mu_k} q - \mathbb{E}_{\mu} q\| \le \varepsilon$.
typos: $dt$ is missing in eq. 6; bracket error on the 3rd-to-last line of page 8.
the method is novel in the sense that it is derived from the idea of injecting noise into nesterov's accelerated gradient; however, the derived theory is not significant, as it relies on extra assumptions and has worse iteration complexity.
### Summary:
the paper considers the high-resolution continuous limit of nesterov's accelerated gradient (nag) algorithm and its connections to sampling (mcmc) methods. the paper develops a hessian-free high-resolution (hfhr) ode and injects noise into it to obtain an accelerated sampling algorithm. further, the paper provides a discrete-time variant of the algorithm by appropriately discretizing hfhr using simple discretization schemes. for strongly log-concave potential functions (log-densities), the paper proves convergence of the order $\widetilde{O}(\sqrt{d}/\epsilon)$ in wasserstein-2 distance. in the asymptotic sense the result matches the convergence of the underdamped langevin algorithm; however, the paper argues that the constants in the proposed algorithm are smaller and empirically shows that the proposed algorithm is faster in practice. the main contributions of the paper are theoretical; however, the theoretical results are supplemented by numerical experiments. overall, the reviewers found the contributions interesting and the theoretical contributions of the paper technically sound. the main concerns that were not completely addressed were related to the presentation of the results and the reproducibility of some of the numerical experiments; while both seem minor and possible to address, ultimately there was not enough support to recommend acceptance. however, the paper is solid and merits acceptance after suitable revisions; thus the authors are encouraged to revise the paper and resubmit it to one of the conferences in the equivalence class of iclr.
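editorial aside for readers following the technical discussion above: the block below is a minimal sketch of how the hfhr dynamics debated in these reviews relate to underdamped langevin dynamics (uld). the exact coefficients are an assumption inferred from the reviewers' descriptions ("an additional gradient drift and brownian motion for the update of the position variable", "hfhr(0, γ) recovers uld"), not a quotation of the paper's equations.

% sketch only: ULD versus the HFHR(alpha, gamma) dynamics as described by the reviewers;
% the paper's exact scaling of the extra position-level terms may differ, and alpha = 0
% is assumed to recover ULD.
\begin{align*}
\text{ULD:}\qquad & dq_t = p_t\,dt, && dp_t = -\gamma p_t\,dt - \nabla f(q_t)\,dt + \sqrt{2\gamma}\,dW_t,\\
\text{HFHR}(\alpha,\gamma):\qquad & dq_t = p_t\,dt - \alpha\nabla f(q_t)\,dt + \sqrt{2\alpha}\,dB_t, && dp_t = -\gamma p_t\,dt - \nabla f(q_t)\,dt + \sqrt{2\gamma}\,dW_t.
\end{align*}

on this reading, setting $\alpha = 0$ removes the extra drift and the position-level noise and recovers uld (the equivalence questioned in point 4 of the first review), and the extra $dB_t$ term in the position equation is exactly what the first weakness raised by the last reviewer is about.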
[input_ids, attention_mask, and labels token-id arrays for the example above omitted — they tokenize the same review/summary text; the attention_mask is all ones and the labels array duplicates the input_ids.]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposed a new face rendering technique to synthesize paired nirvis images for improved face recognition in nir space unlike the previous methods which use imagetoimage translation models learned from paired nirvis images this method use physicalbased 3d face rendering it first reconstructs 3d face meshes and reflectance assets in vis space then infers the corresponding reflectance assets in nir space and finally synthesizes paired visnir images at various head poses and illuminations it also employs a novel idmmd loss to close the gap between vis and nir features in nirvis face recognition training the proposed method helped to achieve stateoftheart face recognition performance on four nirvis benchmarks strengths the proposed method allows generating infinite pairs of visnir facial images covering different head poses and illuminations without learning from any real visnir image pairs the paper proposes a sophisticated algorithm to transform a vis reflectance asset into its nir version the paper proposes a novel idmmd loss to close the gap between vis and nir features in nirvis face recognition training ablation studies confirm its effectiveness by using only the synthesized visnir images the trained face recognizers can produce competitive nir face recognition performance compared with the baseline methods trained on real nirvis datasets after finetuning on real nirvis images these recognizers provide stateoftheart and nearperfect performance on four nirvis face recognition benchmarks weaknesses the synthesized images particularly the vis ones look pretty unrealistic the 3d models only cover the facial part leaving some components missing including hair facial accessories and background the denotations such as mathcalrnir should be explained earlier eg in the text or the caption of table 1 rather than in the caption of fig 4 there is a predefined subset of webface260m called webface4m the authors said they randomly selected images to create their webface4m dataset are they the same dataset if not better change the dataset name to avoid confusion it seems the authors did not augment facial expressions all qualitative figures of synthetic images have neutral expressions i think the expression is not an important factor in the test benchmarks but the authors should consider it when developing face recognizers for realworld applications there is no discussion on limitations the paper limits its application to avoid potential social impacts docsepthe work proposes a nirvis face matching dataset constructed with a physicallybased renderer an idmmd loss is employed to facilitate the identity feature learning as well as reduce the modality discrepancy the work achieves stateoftheart performance on 4 nirvis face recognition benchmarks strengths 1 the proposed method is capable to automatically generate multiple nirvis image pairs with identity information reserved which is of great significance 2 with the proposed training scheme the nirvis face matching dataset helps improve the nirvis face recognition performance by a large margin weaknesses 1 the proposed idmmd loss reduces the distance between the nirvis feature centroids of the same identity which is effective yet not novel 1 1 wei ziyu et al syncretic modality collaborative learning for visible infrared person reidentification iccv 2021 2 the training differences from other compared methods should be more detailly 
stated 3 some related works are missing eg dual face alignment learning network for nirvis face recognition orthogonal modality disentanglement and representation alignment network for nirvis face recognition dualagent gans for photorealistic and identity preserving profile face synthesis see weaknesses questions docsepin this paper the authors propose a method to synthesize nearinfrared faces by transforming them from visible faces based on this the authors can conduct nirvis face recognition without any existing nirvis face datasets besides they also propose an identitybased maximum mean discrepancy loss to facilitate identity feature learning the good performance shows the efficiency of their method strengths 1 the authors utilize a novel 3drendering based method to generate largescale nirvis paired data 2 the authors propose an identitybased maximum mean discrepancy loss to facilitate identity feature learning 3 the descriptions of implementations of the proposed synthesis method are detailed weakness 1 the motivation for using a 3drendering based generating dataset is not clearly illustrated and it seems that the authors just utilize idrelated loss when training it is suggested to strengthen more use of the generation 2 the comparison results are not sufficient it is suggested to add generated data by different percentages to illustrate the effectiveness of the synthesis method 3 i suggest more comparative visualizations to verify the effectiveness of the proposed method in addition the authors should discuss how generalizable the proposed method is in practical situations the limitations and potential negative societal impact have been adequately addressed ### Summary:
the paper received 3 positive reviews the reviewers all lean towards acceptance after the rebuttal overall this work can be of large interest to the community working on nirvis recognition but i hope the authors will present additional visualized results as suggested by the reviewers
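To make the idmmd loss discussed in the review above more concrete, the sketch below treats it as the distance between per-identity VIS and NIR feature centroids, which is the form a maximum mean discrepancy takes under a linear kernel; this reduction, the function name, and the toy inputs are assumptions for illustration rather than the paper's exact formulation.

import torch

def id_mmd_loss(vis_feats: torch.Tensor,
                nir_feats: torch.Tensor,
                vis_ids: torch.Tensor,
                nir_ids: torch.Tensor) -> torch.Tensor:
    # Average squared distance between VIS and NIR feature centroids that
    # share the same identity label (linear-kernel MMD reduces to this).
    losses = []
    shared_ids = set(vis_ids.tolist()) & set(nir_ids.tolist())
    for pid in shared_ids:
        vis_centroid = vis_feats[vis_ids == pid].mean(dim=0)
        nir_centroid = nir_feats[nir_ids == pid].mean(dim=0)
        losses.append((vis_centroid - nir_centroid).pow(2).sum())
    if not losses:
        return vis_feats.new_zeros(())
    return torch.stack(losses).mean()

# Toy usage: 4 VIS and 4 NIR embeddings covering 2 identities.
vis = torch.randn(4, 128)
nir = torch.randn(4, 128)
vis_labels = torch.tensor([0, 0, 1, 1])
nir_labels = torch.tensor([0, 1, 1, 0])
print(id_mmd_loss(vis, nir, vis_labels, nir_labels))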
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 4081, 247, 747, 2454, 18164, 5853, 281, 46919, 18433, 295, 343, 4534, 3888, 323, 5520, 2454, 8981, 275, 295, 343, 2317, 12401, 253, 2045, 3082, 534, 897, 4440, 16713, 5695, 10234, 3210, 6311, 432, 18433, 295, 343, 4534, 3888, 436, 1332, 897, 3520, 3169, 495, 69, 2454, 18164, 352, 806, 17029, 84, 495, 69, 2454, 6191, 1041, 285, 4887, 593, 10434, 275, 1649, 2317, 840, 2192, 398, 253, 3969, 4887, 593, 10434, 275, 295, 343, 2317, 285, 4720, 35143, 4219, 18433, 1649, 79, 343, 3888, 387, 2710, 1481, 24543, 285, 34080, 569, 352, 671, 27532, 247, 4460, 2654, 2188, 69, 2957, 281, 2810, 253, 8037, 875, 1649, 285, 295, 343, 3386, 275, 295, 343, 4534, 2454, 8981, 3733, 253, 4081, 1332, 6518, 281, 5115, 1375, 23037, 14387, 2454, 8981, 3045, 327, 1740, 295, 343, 4534, 49602, 50276, 296, 3755, 20556, 50276, 783, 4081, 1332, 4483, 11365, 11968, 8557, 273, 1649, 79, 343, 17754, 3888, 10985, 1027, 1481, 24543, 285, 34080, 569, 1293, 4715, 432, 667, 1524, 1649, 79, 343, 2460, 8557, 50276, 783, 2929, 29328, 247, 18144, 5933, 281, 4979, 247, 1649, 4887, 593, 15231, 715, 697, 295, 343, 2715, 50276, 783, 2929, 29328, 247, 4460, 2654, 2188, 69, 2957, 281, 2810, 253, 8037, 875, 1649, 285, 295, 343, 3386, 275, 295, 343, 4534, 2454, 8981, 3733, 28913, 2175, 6583, 697, 12510, 50276, 1615, 970, 760, 253, 17791, 1649, 79, 343, 3888, 253, 10166, 2454, 3183, 14460, 476, 4711, 12085, 295, 343, 2454, 8981, 3045, 2429, 342, 253, 8245, 3082, 10166, 327, 1524, 295, 343, 4534, 15302, 846, 1442, 292, 25004, 327, 1524, 295, 343, 4534, 3888, 841, 3183, 14460, 2085, 1375, 23037, 14387, 285, 2822, 32060, 3045, 327, 1740, 295, 343, 4534, 2454, 8981, 49602, 50275, 20881, 1255, 265, 50276, 783, 17791, 3888, 3782, 253, 1649, 4394, 1007, 3965, 46521, 253, 495, 69, 3210, 760, 3835, 253, 17754, 629, 6108, 690, 4295, 5816, 1690, 4707, 17754, 28234, 285, 4114, 50276, 783, 1850, 302, 569, 824, 347, 14168, 1179, 30930, 343, 943, 320, 5544, 4321, 24088, 275, 253, 2505, 390, 253, 11743, 273, 2829, 337, 2581, 685, 275, 253, 11743, 273, 3036, 577, 50276, 9088, 310, 247, 41364, 8578, 273, 4384, 1664, 19319, 78, 1925, 4384, 1664, 21, 78, 253, 4477, 753, 597, 12421, 4236, 3888, 281, 2794, 616, 4384, 1664, 21, 78, 10895, 403, 597, 253, 1072, 10895, 604, 417, 1805, 1818, 253, 10895, 1416, 281, 3693, 13775, 50276, 262, 3133, 253, 4477, 858, 417, 35919, 17754, 12091, 512, 18276, 8442, 273, 13506, 3888, 452, 9238, 12091, 891, 1158, 253, 2048, 310, 417, 271, 1774, 2803, 275, 253, 1071, 49602, 533, 253, 4477, 943, 1908, 352, 672, 6684, 2454, 3183, 14460, 323, 1524, 10186, 4893, 627, 310, 642, 5955, 327, 7364, 253, 2929, 7787, 697, 2898, 281, 3693, 2442, 2675, 16274, 5474, 339, 431, 248, 789, 29328, 247, 295, 343, 4534, 2454, 11038, 10895, 8818, 342, 247, 13318, 3169, 3816, 21052, 271, 2654, 2188, 69, 2957, 310, 7091, 281, 12454, 253, 6489, 4735, 4715, 347, 973, 347, 4796, 253, 36453, 26210, 253, 789, 33526, 1375, 23037, 14387, 3045, 327, 577, 295, 343, 4534, 2454, 8981, 49602, 50276, 296, 3755, 20556, 337, 186, 783, 4081, 1332, 310, 7032, 281, 8356, 6635, 2709, 295, 343, 4534, 2460, 8557, 342, 6489, 1491, 10827, 534, 310, 273, 1270, 8453, 50276, 19, 186, 3113, 253, 4081, 3733, 6974, 253, 295, 343, 4534, 2454, 11038, 10895, 7729, 3157, 253, 295, 343, 4534, 2454, 8981, 3045, 407, 247, 1781, 8459, 50275, 20881, 1255, 265, 337, 186, 783, 4081, 2654, 2188, 69, 2957, 11355, 253, 4181, 875, 253, 295, 343, 
4534, 4735, 1399, 287, 2352, 273, 253, 1072, 6489, 534, 310, 3576, 2568, 417, 4460, 337, 50276, 18, 359, 74, 1182, 14059, 86, 1162, 355, 2753, 2414, 280, 36453, 27549, 4715, 323, 7985, 20542, 1436, 294, 888, 1877, 17857, 17312, 43425, 374, 186, 783, 3733, 3910, 432, 643, 2429, 3082, 943, 320, 625, 2508, 314, 4767, 495, 50276, 8826, 2905, 2987, 403, 5816, 24088, 8746, 2454, 12420, 4715, 2990, 323, 295, 343, 4534, 2454, 8981, 19627, 36453, 557, 290, 606, 1338, 285, 6779, 12420, 2990, 323, 295, 343, 4534, 2454, 8981, 8746, 12788, 305, 507, 323, 38099, 267, 2531, 285, 6489, 24279, 6222, 2454, 9066, 50276, 2887, 32213, 50276, 34974, 5474, 339, 9852, 436, 2929, 253, 4477, 12661, 247, 1332, 281, 46919, 2822, 43030, 9365, 407, 27197, 731, 432, 7985, 9365, 1754, 327, 436, 253, 4477, 476, 2589, 295, 343, 4534, 2454, 8981, 1293, 667, 5368, 295, 343, 4534, 2454, 15302, 16280, 597, 671, 12661, 271, 6489, 3169, 4869, 1599, 26210, 2957, 281, 12454, 6489, 4735, 4715, 253, 1175, 3045, 2722, 253, 6733, 273, 616, 1332, 20544, 337, 253, 4477, 16584, 247, 4460, 495, 69, 12574, 272, 1754, 1332, 281, 6635, 1236, 2510, 25912, 295, 343, 4534, 18433, 941, 374, 253, 4477, 12661, 271, 6489, 3169, 4869, 1599, 26210, 2957, 281, 12454, 6489, 4735, 4715, 495, 253, 20121, 273, 27558, 273, 253, 4081, 9066, 1332, 403, 7000, 50276, 20881, 1255, 337, 253, 16038, 323, 970, 247, 495, 69, 12574, 272, 1754, 11365, 10895, 310, 417, 4518, 12800, 285, 352, 3133, 326, 253, 4477, 816, 16584, 2654, 4919, 2957, 672, 3733, 352, 310, 5125, 281, 17084, 625, 897, 273, 253, 5978, 374, 253, 5301, 1543, 403, 417, 4209, 352, 310, 5125, 281, 823, 4561, 941, 407, 1027, 26026, 281, 17093, 253, 12510, 273, 253, 9066, 1332, 495, 891, 1804, 625, 20407, 5304, 5904, 281, 12654, 253, 12510, 273, 253, 4081, 1332, 275, 1635, 253, 4477, 943, 2319, 849, 2087, 12729, 253, 4081, 1332, 310, 275, 8542, 9534, 50274, 783, 7364, 285, 2442, 4016, 38058, 3486, 452, 644, 18212, 9713, 2490, 187, 4118, 18435, 27, 783, 2929, 2959, 495, 2762, 10123, 253, 30628, 512, 9644, 4404, 14924, 846, 253, 30080, 22559, 50275, 1189, 455, 436, 789, 476, 320, 273, 1781, 1600, 281, 253, 3114, 2444, 327, 295, 343, 4534, 8981, 533, 891, 3524, 253, 4477, 588, 1246, 3081, 27130, 1543, 347, 5125, 407, 253, 30628 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 4081, 247, 747, 2454, 18164, 5853, 281, 46919, 18433, 295, 343, 4534, 3888, 323, 5520, 2454, 8981, 275, 295, 343, 2317, 12401, 253, 2045, 3082, 534, 897, 4440, 16713, 5695, 10234, 3210, 6311, 432, 18433, 295, 343, 4534, 3888, 436, 1332, 897, 3520, 3169, 495, 69, 2454, 18164, 352, 806, 17029, 84, 495, 69, 2454, 6191, 1041, 285, 4887, 593, 10434, 275, 1649, 2317, 840, 2192, 398, 253, 3969, 4887, 593, 10434, 275, 295, 343, 2317, 285, 4720, 35143, 4219, 18433, 1649, 79, 343, 3888, 387, 2710, 1481, 24543, 285, 34080, 569, 352, 671, 27532, 247, 4460, 2654, 2188, 69, 2957, 281, 2810, 253, 8037, 875, 1649, 285, 295, 343, 3386, 275, 295, 343, 4534, 2454, 8981, 3733, 253, 4081, 1332, 6518, 281, 5115, 1375, 23037, 14387, 2454, 8981, 3045, 327, 1740, 295, 343, 4534, 49602, 50276, 296, 3755, 20556, 50276, 783, 4081, 1332, 4483, 11365, 11968, 8557, 273, 1649, 79, 343, 17754, 3888, 10985, 1027, 1481, 24543, 285, 34080, 569, 1293, 4715, 432, 667, 1524, 1649, 79, 343, 2460, 8557, 50276, 783, 2929, 29328, 247, 18144, 5933, 281, 4979, 247, 1649, 4887, 593, 15231, 715, 697, 295, 343, 2715, 50276, 783, 2929, 29328, 247, 4460, 2654, 2188, 69, 2957, 281, 2810, 253, 8037, 875, 1649, 285, 295, 343, 3386, 275, 295, 343, 4534, 2454, 8981, 3733, 28913, 2175, 6583, 697, 12510, 50276, 1615, 970, 760, 253, 17791, 1649, 79, 343, 3888, 253, 10166, 2454, 3183, 14460, 476, 4711, 12085, 295, 343, 2454, 8981, 3045, 2429, 342, 253, 8245, 3082, 10166, 327, 1524, 295, 343, 4534, 15302, 846, 1442, 292, 25004, 327, 1524, 295, 343, 4534, 3888, 841, 3183, 14460, 2085, 1375, 23037, 14387, 285, 2822, 32060, 3045, 327, 1740, 295, 343, 4534, 2454, 8981, 49602, 50275, 20881, 1255, 265, 50276, 783, 17791, 3888, 3782, 253, 1649, 4394, 1007, 3965, 46521, 253, 495, 69, 3210, 760, 3835, 253, 17754, 629, 6108, 690, 4295, 5816, 1690, 4707, 17754, 28234, 285, 4114, 50276, 783, 1850, 302, 569, 824, 347, 14168, 1179, 30930, 343, 943, 320, 5544, 4321, 24088, 275, 253, 2505, 390, 253, 11743, 273, 2829, 337, 2581, 685, 275, 253, 11743, 273, 3036, 577, 50276, 9088, 310, 247, 41364, 8578, 273, 4384, 1664, 19319, 78, 1925, 4384, 1664, 21, 78, 253, 4477, 753, 597, 12421, 4236, 3888, 281, 2794, 616, 4384, 1664, 21, 78, 10895, 403, 597, 253, 1072, 10895, 604, 417, 1805, 1818, 253, 10895, 1416, 281, 3693, 13775, 50276, 262, 3133, 253, 4477, 858, 417, 35919, 17754, 12091, 512, 18276, 8442, 273, 13506, 3888, 452, 9238, 12091, 891, 1158, 253, 2048, 310, 417, 271, 1774, 2803, 275, 253, 1071, 49602, 533, 253, 4477, 943, 1908, 352, 672, 6684, 2454, 3183, 14460, 323, 1524, 10186, 4893, 627, 310, 642, 5955, 327, 7364, 253, 2929, 7787, 697, 2898, 281, 3693, 2442, 2675, 16274, 5474, 339, 431, 248, 789, 29328, 247, 295, 343, 4534, 2454, 11038, 10895, 8818, 342, 247, 13318, 3169, 3816, 21052, 271, 2654, 2188, 69, 2957, 310, 7091, 281, 12454, 253, 6489, 4735, 4715, 347, 973, 347, 4796, 253, 36453, 26210, 253, 789, 33526, 1375, 23037, 14387, 3045, 327, 577, 295, 343, 4534, 2454, 8981, 49602, 50276, 296, 3755, 20556, 337, 186, 783, 4081, 1332, 310, 7032, 281, 8356, 6635, 2709, 295, 343, 4534, 2460, 8557, 342, 6489, 1491, 10827, 534, 310, 273, 1270, 8453, 50276, 19, 186, 3113, 253, 4081, 3733, 6974, 253, 295, 343, 4534, 2454, 11038, 10895, 7729, 3157, 253, 295, 343, 4534, 2454, 8981, 3045, 407, 247, 1781, 8459, 50275, 20881, 1255, 265, 337, 186, 783, 4081, 2654, 2188, 69, 2957, 11355, 253, 4181, 875, 253, 295, 343, 
4534, 4735, 1399, 287, 2352, 273, 253, 1072, 6489, 534, 310, 3576, 2568, 417, 4460, 337, 50276, 18, 359, 74, 1182, 14059, 86, 1162, 355, 2753, 2414, 280, 36453, 27549, 4715, 323, 7985, 20542, 1436, 294, 888, 1877, 17857, 17312, 43425, 374, 186, 783, 3733, 3910, 432, 643, 2429, 3082, 943, 320, 625, 2508, 314, 4767, 495, 50276, 8826, 2905, 2987, 403, 5816, 24088, 8746, 2454, 12420, 4715, 2990, 323, 295, 343, 4534, 2454, 8981, 19627, 36453, 557, 290, 606, 1338, 285, 6779, 12420, 2990, 323, 295, 343, 4534, 2454, 8981, 8746, 12788, 305, 507, 323, 38099, 267, 2531, 285, 6489, 24279, 6222, 2454, 9066, 50276, 2887, 32213, 50276, 34974, 5474, 339, 9852, 436, 2929, 253, 4477, 12661, 247, 1332, 281, 46919, 2822, 43030, 9365, 407, 27197, 731, 432, 7985, 9365, 1754, 327, 436, 253, 4477, 476, 2589, 295, 343, 4534, 2454, 8981, 1293, 667, 5368, 295, 343, 4534, 2454, 15302, 16280, 597, 671, 12661, 271, 6489, 3169, 4869, 1599, 26210, 2957, 281, 12454, 6489, 4735, 4715, 253, 1175, 3045, 2722, 253, 6733, 273, 616, 1332, 20544, 337, 253, 4477, 16584, 247, 4460, 495, 69, 12574, 272, 1754, 1332, 281, 6635, 1236, 2510, 25912, 295, 343, 4534, 18433, 941, 374, 253, 4477, 12661, 271, 6489, 3169, 4869, 1599, 26210, 2957, 281, 12454, 6489, 4735, 4715, 495, 253, 20121, 273, 27558, 273, 253, 4081, 9066, 1332, 403, 7000, 50276, 20881, 1255, 337, 253, 16038, 323, 970, 247, 495, 69, 12574, 272, 1754, 11365, 10895, 310, 417, 4518, 12800, 285, 352, 3133, 326, 253, 4477, 816, 16584, 2654, 4919, 2957, 672, 3733, 352, 310, 5125, 281, 17084, 625, 897, 273, 253, 5978, 374, 253, 5301, 1543, 403, 417, 4209, 352, 310, 5125, 281, 823, 4561, 941, 407, 1027, 26026, 281, 17093, 253, 12510, 273, 253, 9066, 1332, 495, 891, 1804, 625, 20407, 5304, 5904, 281, 12654, 253, 12510, 273, 253, 4081, 1332, 275, 1635, 253, 4477, 943, 2319, 849, 2087, 12729, 253, 4081, 1332, 310, 275, 8542, 9534, 50274, 783, 7364, 285, 2442, 4016, 38058, 3486, 452, 644, 18212, 9713, 2490, 187, 4118, 18435, 27, 783, 2929, 2959, 495, 2762, 10123, 253, 30628, 512, 9644, 4404, 14924, 846, 253, 30080, 22559, 50275, 1189, 455, 436, 789, 476, 320, 273, 1781, 1600, 281, 253, 3114, 2444, 327, 295, 343, 4534, 8981, 533, 891, 3524, 253, 4477, 588, 1246, 3081, 27130, 1543, 347, 5125, 407, 253, 30628 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary this paper proposes a knowledge distillation kd approach for selfsupervised learning ssl with small neural network models the authors first observe that the stateoftheart contrastive learningbased ssl does not obtain good performance on small models due to the larger model capacity required for instance discrimination to tackle this problem they propose a seed a kd method where the smaller student model learns to mimic its larger teacher models similarity distribution between an instance and its augmented views using a crossentropy based objective the authors perform various experiments to show that 1 seed obtains substantial improvement in sslbased imagenet classification performance for small models as compared to ssl training without seed 2 the performance gains are also substantial for transfer learning on other classification tasks 3 the performance gains are smaller for downstream tasks of object detection and instance segmentation with performance gains reducing for the larger coco dataset as compared to voc 4 seed is robust to choice of ssl method and performs better than other kd approaches strengths the paper is clearly written and wellorganized the seed approach is wellmotivated and sensible experimental validation and the ablation studies are quite thorough performance gains on classification tasks are substantial the method is simple to implement weaknesses 1 performance gains on downstream tasks of detection and instance segmentation are much lower how would the authors propose to improve these 2 if the primary goal is to improve ssl performance on small models i would have liked to see more analysis on how different design choices of setting up contrastive learning affect model performance and if these could aid performance improvement in addition to knowledge distillation questions and suggestions 1 adding fullysupervised baselines for small models in table 1 will be useful in understanding the gap between full supervision and ssl for these models 2 in figure 3 does 100 green line represent the student network trained with 100 of labeled imagenet supervised data it is hard to interpret what these numbers represent 3 minor point some citations which should not be in parentheses are in parentheses eg romero et al page 8 please fix this in the revision docsepthe paper proposes to distill knowledge from large teacher networks to small student networks in selfsupervised learning experimental results show significant improvements on small networks concerns during the selfsupervised distillation phase it is not clear to me only distillation loss is applied or selfsupervised learning loss is combined with distillation loss for learning student network if only distillation loss is applied does it make sense to train a student network using both selfsupervised learning loss and distillation loss such as mocov2 used in most experiments the caption of figure 3 is confusing it would be good to explain it more clearly it seems that improvements on object detection and instance segmentation table 2 are relatively small compared to other experiments are there any explanations could it be possible to use smaller student networks in this experiment as well in the experiment of different sizes of sample queue does it mean that the larger the better what is the intuition behind it strong data augmentation is needed for most of selfsupervised methods but normally for 
distillation it is not common to use very strong data augmentation why does the paper decide to use the same data augmentation for both selfsupervised learning and distillation learning it is quite similar to a very recent paper as shown in the following it would be good to discuss the differences in the paper inproceedingskoohpayegani2020compress titlecompress selfsupervised learning by compressing representations authorkoohpayegani soroush abbasi and tejankar ajinkya and pirsiavash hamed booktitleneurips year2020 docsepsummary the paper address the problem of knowledge distillation in selfsupervised learning where the representational knowledge from the larger model ie teacher is used to guide the learning of a smaller model ie student to achieve this an instance queue is used to compute the similarity score between teacher model features and the feature of a given image and the learning objective is to minimise the crossentropy loss of the similarity between teacher and student models the paper provides comprehensive empirical results to justify the efficacy of the proposed approach justification of rating the paper provides a comprehensive evaluation of the proposed approach on various network architecture and downstream tasks overall i feel this work does not have sufficient theoretical or algorithmic contributions the key contribution is the idea of apply knowledge distillation for selfsupervised learning strengths this is the first work that addresses selfsupervised learning ssl with knowledge distillation it empirically shows ssl with a small model is challenging consistent with finding from chen et al 2020ab and proposed a technique seed to transfer knowledge this paper provides comprehensive experiment on image classification task selfsupervised semisupervised and supervised object detectionsegmentation domain transfer as well as provide ablation studies on various model architecture and parameters the results show knowledge distillation is effective for selfsupervised learning weakness the core novelty of this work is the idea of conduct knowledge distillation in selfsupervised learning the key weakness is that the knowledge distillation approach and the instance queue approach are previously proposed and known to the research community this work empirically shows how it can be combined for the task on hand minor comments in section 32 before eqn 2 it is best to change the similarity score between xi and vecdjs to the similarity between the extracted teacherstudent feature zi and vecdjs this is to avoid confusion as one might wonder how can the similarity between the input image and the vecdj be computed please move the tables caption to the top of the table ### Summary:
there is definite consensus on this paper with all reviewers expressing very favorable opinions the author responses are very well articulated and address the main concerns expressed by the reviewers the paper is very wellwritten and the ablation study wellexecuted some recent related work was missed in the original submission but this was adequately addressed in rebuttal the proposed approach is novel technique for feature representation learning the clarifications to the manuscript and the new analyses are especially appreciated
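The distillation objective described in the review above, a cross-entropy between the teacher's and the student's similarity distributions over a maintained instance queue, can be sketched roughly as follows; the temperatures, the cosine normalisation, and the queue contents are assumptions for illustration, not the published recipe, and all names are hypothetical.

import torch
import torch.nn.functional as F

def distill_loss(student_feat, teacher_feat, queue, t_student=0.2, t_teacher=0.07):
    # Cosine similarities of each view against every queued instance feature.
    s = F.normalize(student_feat, dim=-1)          # (B, d)
    t = F.normalize(teacher_feat, dim=-1)          # (B, d)
    q = F.normalize(queue, dim=-1)                 # (K, d)
    logits_s = s @ q.T / t_student                 # (B, K)
    logits_t = t @ q.T / t_teacher                 # (B, K)
    # Cross-entropy pushing the student distribution toward the teacher's.
    p_teacher = F.softmax(logits_t, dim=-1)
    log_p_student = F.log_softmax(logits_s, dim=-1)
    return -(p_teacher * log_p_student).sum(dim=-1).mean()

# Toy usage: a batch of 8 paired features and a queue of 1024 stored features.
loss = distill_loss(torch.randn(8, 256), torch.randn(8, 256), torch.randn(1024, 256))
print(loss)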
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 8774, 436, 2929, 29328, 247, 50276, 36871, 940, 21755, 465, 69, 2746, 323, 1881, 35421, 4715, 256, 3433, 342, 1355, 11454, 2990, 3210, 253, 4477, 806, 10018, 326, 253, 1375, 23037, 14387, 4499, 422, 4715, 3169, 256, 3433, 1057, 417, 4044, 1175, 3045, 327, 1355, 3210, 1955, 50276, 936, 253, 4067, 50276, 7645, 5350, 2424, 323, 4227, 11081, 281, 18915, 436, 1895, 597, 12661, 247, 8357, 247, 50276, 76, 69, 1332, 835, 253, 4577, 5974, 1566, 33772, 281, 25066, 697, 4067, 9732, 3210, 14259, 3268, 875, 271, 4227, 285, 697, 31612, 6849, 970, 247, 2831, 290, 10144, 1754, 8103, 253, 4477, 1347, 2710, 4679, 281, 921, 326, 50276, 18, 50276, 24270, 31326, 6832, 7756, 275, 256, 3433, 3169, 4440, 257, 292, 9162, 3045, 323, 1355, 3210, 347, 2429, 281, 256, 3433, 3733, 1293, 8357, 374, 253, 3045, 15988, 403, 671, 6832, 323, 3700, 4715, 327, 643, 9162, 8892, 495, 253, 3045, 15988, 403, 4577, 323, 15450, 8892, 273, 1789, 5481, 285, 4227, 26405, 342, 3045, 15988, 8493, 323, 253, 4067, 9285, 80, 10895, 347, 2429, 281, 11571, 577, 8357, 310, 10237, 281, 4327, 273, 256, 3433, 1332, 285, 17923, 1805, 685, 643, 465, 69, 7274, 50276, 296, 3755, 20556, 253, 2929, 310, 4518, 3542, 285, 973, 34092, 253, 8357, 2746, 50276, 261, 973, 24013, 8550, 285, 24600, 5661, 12820, 285, 253, 28913, 2175, 403, 3240, 11080, 3045, 15988, 327, 9162, 8892, 403, 6832, 253, 1332, 310, 2969, 281, 3359, 50275, 20881, 1255, 265, 337, 3045, 15988, 327, 15450, 8892, 273, 5481, 285, 4227, 26405, 403, 1199, 2406, 50276, 5430, 651, 253, 4477, 12661, 281, 3157, 841, 374, 604, 253, 3625, 4736, 310, 281, 3157, 256, 3433, 3045, 327, 1355, 3210, 891, 651, 452, 10490, 281, 923, 625, 1783, 327, 849, 1027, 2216, 10165, 273, 4758, 598, 4499, 422, 4715, 2818, 1566, 3045, 285, 604, 841, 812, 8596, 3045, 7756, 275, 1635, 281, 3640, 940, 21755, 50275, 34974, 285, 13991, 337, 6240, 4751, 35421, 1666, 25379, 323, 1355, 3210, 275, 2829, 337, 588, 320, 4217, 275, 4685, 253, 8037, 875, 2120, 20446, 285, 256, 3433, 323, 841, 3210, 374, 275, 4677, 495, 1057, 2233, 4759, 1386, 1957, 253, 5974, 2990, 10166, 342, 2233, 273, 13130, 4440, 257, 292, 22296, 941, 352, 310, 1892, 281, 4665, 752, 841, 3904, 1957, 495, 5884, 1127, 690, 30404, 534, 943, 417, 320, 275, 41616, 403, 275, 41616, 24088, 10102, 2771, 1162, 355, 3239, 854, 4496, 4993, 436, 275, 253, 18520, 50276, 7152, 339, 431, 248, 2929, 29328, 281, 940, 408, 3640, 432, 1781, 9732, 6928, 281, 1355, 5974, 6928, 275, 1881, 35421, 4715, 5661, 1543, 921, 1534, 11701, 327, 1355, 6928, 50276, 585, 1209, 2224, 50275, 32674, 253, 1881, 35421, 940, 21755, 3408, 352, 310, 417, 2590, 281, 479, 760, 940, 21755, 2957, 310, 3732, 390, 1881, 35421, 4715, 2957, 310, 5678, 342, 940, 21755, 2957, 323, 4715, 5974, 2990, 604, 760, 940, 21755, 2957, 310, 3732, 1057, 352, 1056, 3282, 281, 6194, 247, 5974, 2990, 970, 1097, 1881, 35421, 4715, 2957, 285, 940, 21755, 2957, 824, 347, 278, 406, 729, 19, 908, 275, 954, 4679, 50275, 783, 11743, 273, 4677, 495, 310, 21643, 352, 651, 320, 1175, 281, 5513, 352, 625, 4518, 50276, 262, 3133, 326, 11701, 327, 1789, 5481, 285, 4227, 26405, 2829, 374, 403, 4942, 1355, 2429, 281, 643, 4679, 403, 627, 667, 22909, 812, 352, 320, 1896, 281, 897, 4577, 5974, 6928, 275, 436, 3368, 347, 973, 50275, 249, 253, 3368, 273, 1027, 9552, 273, 3410, 15154, 1057, 352, 1599, 326, 253, 4067, 253, 1805, 752, 310, 253, 30328, 3212, 352, 50276, 9072, 941, 42072, 310, 3058, 323, 954, 
273, 1881, 35421, 3082, 533, 9403, 323, 940, 21755, 352, 310, 417, 1846, 281, 897, 1077, 2266, 941, 42072, 2139, 1057, 253, 2929, 7617, 281, 897, 253, 1072, 941, 42072, 323, 1097, 1881, 35421, 4715, 285, 940, 21755, 4715, 50276, 262, 310, 3240, 2074, 281, 247, 1077, 3332, 2929, 347, 2011, 275, 253, 1563, 352, 651, 320, 1175, 281, 2319, 253, 3910, 275, 253, 2929, 50276, 249, 856, 22868, 7381, 1368, 12080, 909, 6451, 14952, 42693, 50275, 5564, 42693, 1881, 35421, 4715, 407, 509, 13537, 14237, 50275, 14399, 1064, 80, 1368, 12080, 909, 6451, 256, 11303, 73, 490, 67, 9720, 285, 716, 75, 1164, 274, 29168, 750, 5973, 285, 19188, 84, 571, 87, 1225, 288, 3163, 50275, 3305, 39584, 28740, 321, 2824, 50275, 2913, 14952, 5474, 339, 793, 360, 3454, 50276, 783, 2929, 2953, 253, 1895, 273, 3640, 940, 21755, 275, 1881, 35421, 4715, 835, 253, 1957, 1050, 3640, 432, 253, 4067, 1566, 26332, 9732, 310, 908, 281, 7102, 253, 4715, 273, 247, 4577, 1566, 26332, 5974, 281, 5115, 436, 271, 4227, 15154, 310, 908, 281, 11897, 253, 14259, 4868, 875, 9732, 1566, 3386, 285, 253, 4735, 273, 247, 1677, 2460, 285, 253, 4715, 8103, 310, 281, 7221, 885, 253, 2831, 290, 10144, 2957, 273, 253, 14259, 875, 9732, 285, 5974, 3210, 253, 2929, 3400, 11088, 16774, 1543, 281, 15249, 253, 10307, 273, 253, 4081, 2746, 50276, 6309, 1877, 273, 13716, 253, 2929, 3400, 247, 11088, 7103, 273, 253, 4081, 2746, 327, 2710, 2990, 10336, 285, 15450, 8892, 4583, 891, 1928, 436, 789, 1057, 417, 452, 4209, 10527, 390, 5933, 280, 9021, 253, 2234, 7680, 310, 253, 2934, 273, 4647, 3640, 940, 21755, 323, 1881, 35421, 4715, 50276, 296, 3755, 20556, 50276, 2520, 310, 253, 806, 789, 326, 12453, 1881, 35421, 4715, 256, 3433, 342, 3640, 940, 21755, 352, 45190, 2722, 256, 3433, 342, 247, 1355, 1566, 310, 11132, 5185, 342, 4560, 432, 260, 864, 1162, 355, 9169, 357, 285, 4081, 247, 5853, 8357, 281, 3700, 3640, 50276, 2520, 2929, 3400, 11088, 3368, 327, 2460, 9162, 4836, 1881, 35421, 49863, 29974, 13337, 285, 22296, 1789, 5481, 29429, 318, 5028, 3700, 347, 973, 347, 2085, 28913, 2175, 327, 2710, 1566, 10336, 285, 3602, 253, 1543, 921, 3640, 940, 21755, 310, 3576, 323, 1881, 35421, 4715, 50275, 20881, 1255, 50275, 783, 5161, 38135, 273, 436, 789, 310, 253, 2934, 273, 2589, 3640, 940, 21755, 275, 1881, 35421, 4715, 253, 2234, 14855, 310, 326, 253, 3640, 940, 21755, 2746, 285, 253, 4227, 15154, 2746, 403, 3786, 4081, 285, 1929, 281, 253, 2561, 3114, 436, 789, 45190, 2722, 849, 352, 476, 320, 5678, 323, 253, 4836, 327, 1133, 50276, 37585, 5701, 50275, 249, 2593, 4567, 1078, 16186, 79, 374, 352, 310, 1682, 281, 1818, 253, 14259, 4868, 875, 1269, 74, 285, 1670, 2428, 4305, 281, 253, 14259, 875, 253, 10375, 9732, 39095, 4735, 1182, 74, 285, 1670, 2428, 4305, 436, 310, 281, 3693, 13775, 347, 581, 1537, 4282, 849, 476, 253, 14259, 875, 253, 3280, 2460, 285, 253, 1670, 2428, 75, 320, 10302, 50276, 32897, 2118, 253, 7180, 11743, 281, 253, 1755, 273, 253, 2829, 187, 187, 4118, 18435, 27, 9088, 310, 19040, 13969, 327, 436, 2929, 342, 512, 30628, 13002, 1077, 13857, 11626, 253, 2488, 6128, 403, 1077, 973, 35144, 285, 2953, 253, 2022, 7350, 4469, 407, 253, 30628, 253, 2929, 310, 1077, 973, 15720, 285, 253, 28913, 1263, 6210, 1591, 886, 4525, 690, 3332, 2905, 789, 369, 9829, 275, 253, 3236, 19529, 533, 436, 369, 18212, 9713, 275, 30080, 22559, 253, 4081, 2746, 310, 4460, 5853, 323, 4735, 6779, 4715, 253, 8254, 6787, 281, 253, 7714, 285, 253, 747, 6260, 403, 3340, 14109, 209 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 8774, 436, 2929, 29328, 247, 50276, 36871, 940, 21755, 465, 69, 2746, 323, 1881, 35421, 4715, 256, 3433, 342, 1355, 11454, 2990, 3210, 253, 4477, 806, 10018, 326, 253, 1375, 23037, 14387, 4499, 422, 4715, 3169, 256, 3433, 1057, 417, 4044, 1175, 3045, 327, 1355, 3210, 1955, 50276, 936, 253, 4067, 50276, 7645, 5350, 2424, 323, 4227, 11081, 281, 18915, 436, 1895, 597, 12661, 247, 8357, 247, 50276, 76, 69, 1332, 835, 253, 4577, 5974, 1566, 33772, 281, 25066, 697, 4067, 9732, 3210, 14259, 3268, 875, 271, 4227, 285, 697, 31612, 6849, 970, 247, 2831, 290, 10144, 1754, 8103, 253, 4477, 1347, 2710, 4679, 281, 921, 326, 50276, 18, 50276, 24270, 31326, 6832, 7756, 275, 256, 3433, 3169, 4440, 257, 292, 9162, 3045, 323, 1355, 3210, 347, 2429, 281, 256, 3433, 3733, 1293, 8357, 374, 253, 3045, 15988, 403, 671, 6832, 323, 3700, 4715, 327, 643, 9162, 8892, 495, 253, 3045, 15988, 403, 4577, 323, 15450, 8892, 273, 1789, 5481, 285, 4227, 26405, 342, 3045, 15988, 8493, 323, 253, 4067, 9285, 80, 10895, 347, 2429, 281, 11571, 577, 8357, 310, 10237, 281, 4327, 273, 256, 3433, 1332, 285, 17923, 1805, 685, 643, 465, 69, 7274, 50276, 296, 3755, 20556, 253, 2929, 310, 4518, 3542, 285, 973, 34092, 253, 8357, 2746, 50276, 261, 973, 24013, 8550, 285, 24600, 5661, 12820, 285, 253, 28913, 2175, 403, 3240, 11080, 3045, 15988, 327, 9162, 8892, 403, 6832, 253, 1332, 310, 2969, 281, 3359, 50275, 20881, 1255, 265, 337, 3045, 15988, 327, 15450, 8892, 273, 5481, 285, 4227, 26405, 403, 1199, 2406, 50276, 5430, 651, 253, 4477, 12661, 281, 3157, 841, 374, 604, 253, 3625, 4736, 310, 281, 3157, 256, 3433, 3045, 327, 1355, 3210, 891, 651, 452, 10490, 281, 923, 625, 1783, 327, 849, 1027, 2216, 10165, 273, 4758, 598, 4499, 422, 4715, 2818, 1566, 3045, 285, 604, 841, 812, 8596, 3045, 7756, 275, 1635, 281, 3640, 940, 21755, 50275, 34974, 285, 13991, 337, 6240, 4751, 35421, 1666, 25379, 323, 1355, 3210, 275, 2829, 337, 588, 320, 4217, 275, 4685, 253, 8037, 875, 2120, 20446, 285, 256, 3433, 323, 841, 3210, 374, 275, 4677, 495, 1057, 2233, 4759, 1386, 1957, 253, 5974, 2990, 10166, 342, 2233, 273, 13130, 4440, 257, 292, 22296, 941, 352, 310, 1892, 281, 4665, 752, 841, 3904, 1957, 495, 5884, 1127, 690, 30404, 534, 943, 417, 320, 275, 41616, 403, 275, 41616, 24088, 10102, 2771, 1162, 355, 3239, 854, 4496, 4993, 436, 275, 253, 18520, 50276, 7152, 339, 431, 248, 2929, 29328, 281, 940, 408, 3640, 432, 1781, 9732, 6928, 281, 1355, 5974, 6928, 275, 1881, 35421, 4715, 5661, 1543, 921, 1534, 11701, 327, 1355, 6928, 50276, 585, 1209, 2224, 50275, 32674, 253, 1881, 35421, 940, 21755, 3408, 352, 310, 417, 2590, 281, 479, 760, 940, 21755, 2957, 310, 3732, 390, 1881, 35421, 4715, 2957, 310, 5678, 342, 940, 21755, 2957, 323, 4715, 5974, 2990, 604, 760, 940, 21755, 2957, 310, 3732, 1057, 352, 1056, 3282, 281, 6194, 247, 5974, 2990, 970, 1097, 1881, 35421, 4715, 2957, 285, 940, 21755, 2957, 824, 347, 278, 406, 729, 19, 908, 275, 954, 4679, 50275, 783, 11743, 273, 4677, 495, 310, 21643, 352, 651, 320, 1175, 281, 5513, 352, 625, 4518, 50276, 262, 3133, 326, 11701, 327, 1789, 5481, 285, 4227, 26405, 2829, 374, 403, 4942, 1355, 2429, 281, 643, 4679, 403, 627, 667, 22909, 812, 352, 320, 1896, 281, 897, 4577, 5974, 6928, 275, 436, 3368, 347, 973, 50275, 249, 253, 3368, 273, 1027, 9552, 273, 3410, 15154, 1057, 352, 1599, 326, 253, 4067, 253, 1805, 752, 310, 253, 30328, 3212, 352, 50276, 9072, 941, 42072, 310, 3058, 323, 954, 
273, 1881, 35421, 3082, 533, 9403, 323, 940, 21755, 352, 310, 417, 1846, 281, 897, 1077, 2266, 941, 42072, 2139, 1057, 253, 2929, 7617, 281, 897, 253, 1072, 941, 42072, 323, 1097, 1881, 35421, 4715, 285, 940, 21755, 4715, 50276, 262, 310, 3240, 2074, 281, 247, 1077, 3332, 2929, 347, 2011, 275, 253, 1563, 352, 651, 320, 1175, 281, 2319, 253, 3910, 275, 253, 2929, 50276, 249, 856, 22868, 7381, 1368, 12080, 909, 6451, 14952, 42693, 50275, 5564, 42693, 1881, 35421, 4715, 407, 509, 13537, 14237, 50275, 14399, 1064, 80, 1368, 12080, 909, 6451, 256, 11303, 73, 490, 67, 9720, 285, 716, 75, 1164, 274, 29168, 750, 5973, 285, 19188, 84, 571, 87, 1225, 288, 3163, 50275, 3305, 39584, 28740, 321, 2824, 50275, 2913, 14952, 5474, 339, 793, 360, 3454, 50276, 783, 2929, 2953, 253, 1895, 273, 3640, 940, 21755, 275, 1881, 35421, 4715, 835, 253, 1957, 1050, 3640, 432, 253, 4067, 1566, 26332, 9732, 310, 908, 281, 7102, 253, 4715, 273, 247, 4577, 1566, 26332, 5974, 281, 5115, 436, 271, 4227, 15154, 310, 908, 281, 11897, 253, 14259, 4868, 875, 9732, 1566, 3386, 285, 253, 4735, 273, 247, 1677, 2460, 285, 253, 4715, 8103, 310, 281, 7221, 885, 253, 2831, 290, 10144, 2957, 273, 253, 14259, 875, 9732, 285, 5974, 3210, 253, 2929, 3400, 11088, 16774, 1543, 281, 15249, 253, 10307, 273, 253, 4081, 2746, 50276, 6309, 1877, 273, 13716, 253, 2929, 3400, 247, 11088, 7103, 273, 253, 4081, 2746, 327, 2710, 2990, 10336, 285, 15450, 8892, 4583, 891, 1928, 436, 789, 1057, 417, 452, 4209, 10527, 390, 5933, 280, 9021, 253, 2234, 7680, 310, 253, 2934, 273, 4647, 3640, 940, 21755, 323, 1881, 35421, 4715, 50276, 296, 3755, 20556, 50276, 2520, 310, 253, 806, 789, 326, 12453, 1881, 35421, 4715, 256, 3433, 342, 3640, 940, 21755, 352, 45190, 2722, 256, 3433, 342, 247, 1355, 1566, 310, 11132, 5185, 342, 4560, 432, 260, 864, 1162, 355, 9169, 357, 285, 4081, 247, 5853, 8357, 281, 3700, 3640, 50276, 2520, 2929, 3400, 11088, 3368, 327, 2460, 9162, 4836, 1881, 35421, 49863, 29974, 13337, 285, 22296, 1789, 5481, 29429, 318, 5028, 3700, 347, 973, 347, 2085, 28913, 2175, 327, 2710, 1566, 10336, 285, 3602, 253, 1543, 921, 3640, 940, 21755, 310, 3576, 323, 1881, 35421, 4715, 50275, 20881, 1255, 50275, 783, 5161, 38135, 273, 436, 789, 310, 253, 2934, 273, 2589, 3640, 940, 21755, 275, 1881, 35421, 4715, 253, 2234, 14855, 310, 326, 253, 3640, 940, 21755, 2746, 285, 253, 4227, 15154, 2746, 403, 3786, 4081, 285, 1929, 281, 253, 2561, 3114, 436, 789, 45190, 2722, 849, 352, 476, 320, 5678, 323, 253, 4836, 327, 1133, 50276, 37585, 5701, 50275, 249, 2593, 4567, 1078, 16186, 79, 374, 352, 310, 1682, 281, 1818, 253, 14259, 4868, 875, 1269, 74, 285, 1670, 2428, 4305, 281, 253, 14259, 875, 253, 10375, 9732, 39095, 4735, 1182, 74, 285, 1670, 2428, 4305, 436, 310, 281, 3693, 13775, 347, 581, 1537, 4282, 849, 476, 253, 14259, 875, 253, 3280, 2460, 285, 253, 1670, 2428, 75, 320, 10302, 50276, 32897, 2118, 253, 7180, 11743, 281, 253, 1755, 273, 253, 2829, 187, 187, 4118, 18435, 27, 9088, 310, 19040, 13969, 327, 436, 2929, 342, 512, 30628, 13002, 1077, 13857, 11626, 253, 2488, 6128, 403, 1077, 973, 35144, 285, 2953, 253, 2022, 7350, 4469, 407, 253, 30628, 253, 2929, 310, 1077, 973, 15720, 285, 253, 28913, 1263, 6210, 1591, 886, 4525, 690, 3332, 2905, 789, 369, 9829, 275, 253, 3236, 19529, 533, 436, 369, 18212, 9713, 275, 30080, 22559, 253, 4081, 2746, 310, 4460, 5853, 323, 4735, 6779, 4715, 253, 8254, 6787, 281, 253, 7714, 285, 253, 747, 6260, 403, 3340, 14109, 209 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposes a new valuebased method using risk measures in cooperative multiagent reinforcement learning the authors propose a new network structure that calculates global cvar through individual distribution and learns risksensitized multiagent policies the authors also propose a new dynamic risk level prediction method that can dynamically adjust the risk level according to the agents observation and action applying risksensitive reinforcement learning in multiagent reinforcement learning is interesting but several points can be improved this paper has a fundamental difference compared to the distributional reinforcement learning recently studied in singleagent reinforcement learning like iqn in singleagent reinforcement learning the distribution of qfunction is first learned through the distributional bellman operator and the learned distribution can be utilized in additional applications such as risksensitive reinforcement learning even if the mean value of the distribution is used without considering the risk it shows higher performance than the existing reinforcement learning algorithms without distribution the distributional bellman operator has a richer training signal than the bellman operator for the existing scalar qfunction which enables fast representation learning moreover by changing the sampling distribution through the learned distribution risksensitive reinforcement learning can be easily applied to multiple risk measures however in this paper the authors do not use distribution as a direct learning objective but only aim to maximize cvar unlike the distributional bellman operator this learning method cannot be expected to improve training speed and it is difficult to understand the meaning of the learned return distribution learned with the scalar loss function also this paper proposes a dynamic risk level prediction but this part is confused the authors argue that this method solves timeconsistency issues and unstable learning however there is an insufficient explanation as to why this problem occurred and how to solve it there is also a lack of explanation on why risk level is divided into k steps and why alpha is defined as in equation 3 finally detailed analyses are needed on how dynamic risk level prediction affects performance there is a paper which the authors must cite and compare to this is not the same as rmix but close enough that it has to be compared hu et al 2020 qrmix distributional value function factorisation for cooperative multiagent reinforcement learning urlhttpsarxivorgpdf200904197pdf docsep strengths 1 the paper is wellwritten and versed in the pulse of related works on the topic this makes it very easy to assess the points of differentiation of the formulationfocus from prior art especially the problem class of marl with risk measures is salient 2 the authors clearly demonstrate the practical merits of the proposed techniques in ambitious experiments this is a major upshot of the work one may clearly see across challenging domains the performance gains achieved by rmix 3 solving decpomdps is in general intractable and one must often resort to particle filteringbelief representations of the state combining these technical challenges with risk sensitive utilities is nontrivial and the authors have made a bold effort to obtain a tractable algorithm despite these issues importantly their work has some conceptual justification 
weaknesses 1 while the conceptual setup and theoretical contributions are clear the actual training mechanism is given very little explanation there should at least be an iterative update scheme or pseudocode presenting the key algorithm of this work this would serve to make it easier to distinguish what are the unique attributes to the algorithm put forth in this work outside of a contextual discussion and at a more granular algorithmic level by reading the paper it is very difficult to understand what information agents must exchange during training 2 what is the purpose of the generalized return pdf how does the information required for its estimation get processed algorithmically and can each agent estimate its component of it with local information only this is not easy to discern 3 theorem 1 must somehow assume that each agents marginal utility is concave and that the joint utility is jointly concave in agents local policies but a discussion of this seems missing without concavity then one can only ensure that agents policies in the decentralized setting converge to stationarity and that these stationary points belong to the set of extrema of the global utility it may be the case that the fact that the risksensitive bellman operator being a contraction is enough to mitigate these subtleties and ensure convergence to a global extrema but this is not discussed 4 just defining the td loss in eqn 4 is not enough because the presence of a risk measure means that a vanilla td step in terms of approximating an expected value does not hold how does the blocking to avoid changing the weights of the agents network from the dynamic risk level predictor change the td population objective under normal circumstances this would be the bellman optimality operator how does ctot mitigate this issue and what relationship does this hold to the notions of risksensitive bellman equations in andrzej ruszczynski riskaverse dynamic programming for markov decision processes mathematical programming 1252235261 2010 see also kose u ruszczynski a 2020 riskaverse learning by temporal difference methods arxiv preprint arxiv200300780 5 in general a discussion of the technical innovations required to establish the theorems is absent are these theorems inherited from the algorithms appearing in earlier work what is new minor comments 1 references missing regarding risksensitive rl zhang junyu amrit singh bedi mengdi wang and alec koppel cautious reinforcement learning via distributional risk in the dual domain arxiv preprint arxiv200212475 2020 2 the link between equation 2 and estimating a conditional expectation should be made more explicit 3 the meaning and interpretation of igm is ambiguous more effort needs to be expended to properly explain it the idea seems to be that individual agents optimal policy is equivalent to an oraclemetaagent that aggregates information over the network this notion is then key to interpreting the importance of theorem 1 docsepoverall the paper considers the cooperative multiagent marl setting where agents have partial observability and share a team reward instead of optimizing the expected cumulative reward the paper considers optimizing the cvar measure the proposed method relies on ctde and features a dynamic risklevel predictor although cvar optimization is an important problem for marl and the experimental results seem to be convincing the formulation of the problem and the description of the proposed method are not written with enough clarity specifically i have the following 
commentsquestions commentsquestions 1 for cvar optimization is the risk level ie alpha given as part of the problem itself if it is given whats the intuition behind the dynamic risk level predictor that handles the temporal nature of stochastic outcomes what is the temporal nature 2 its known that for a discrete random variable its pdf is composed by dirac functions weighted by the pmf is definition 1 repeating this 3 i have difficulty understanding equations 1 and 2 in 1 whats the meaning of deltataui ui isnt that the dirac function takes a number as its argument why can the dirac function take a trajectory as its argument for 2 why does the estimate not depend on pj that appears in 1 4 in theorem 1 what is the definition of igm and the definition of ctot if the reward is shared why would the local cvar be different from the global cvar although the paper considers an important optimization objective for marl i dont think the presentation is ready for publication yet docsepthe authors propose rmix to deal with the randomness of rewards and the uncertainty in environments rmix learns the individual value distributions of each agent and uses a predictor to calculate the dynamic risk level given the individual value distribution and the risk level a cvar operator outputs the c value for execution for training the c values are mixed as ctot and updated by td error endtoend rmix outperforms a series of value decomposition baselines on many challenging starcraft ii tasks the paper is very clear and wellstructured expanding value decomposition methods to the risksensitive field is a novel idea and it shows competitive performance in empirical studies however my main concern is the definition of the risk level alpha more indepth analysis is expected to interpret why the discrepancy between the embedding of current individual return distributions and the embedding of historical return distributions could reflect the risk level since the embedding parameters femb and phi are trained endtoend by tderror the real meaning of alpha is unknowable eq 3 is a little confusing and i get the left side of eq 3 should be palphaik not alphaik it is hard to understand how to get the final alphai from the krange probability figure 12 helps a lot in this understanding and i expect it to be more detailed and put on the main pages rmix is built on qmix however the individual value function in qmix does not estimate a real expected return and the value has no meaning is the theoretical analysis of the risk still valid in qmix rmix is proposed for the randomness of rewards and the uncertainty in environments however i think they are not usually observed in the environment starcraft ii since the policies of the enemy are fixed increasing the randomness and uncertainty could verify the advantages of rmix update after author response i thank the authors for the detailed response most of my concerns have been addressed and i decide to keep my score ### Summary:
this paper proposes a method of risksensitive multiagent reinforcement learning in cooperative settings the proposed method introduces several new ideas but they are not theoretically well founded which has caused many confusions among the reviewers although some of the confusions are resolved through discussion there remain major concerns about the validity of the method
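to make the recurring reviewer questions about definition 1 and the cvar objective easier to follow, the standard definitions are sketched below; this is reference notation only (the tail convention and the exact operator used in the paper under review may differ).

```latex
% a discrete return Z taking values z_i with probability masses p_i has density
p_Z(z) = \sum_i p_i \, \delta(z - z_i),
\qquad
\mathrm{VaR}_\alpha(Z) = \inf\{\, z : F_Z(z) \ge \alpha \,\}

% lower-tail CVaR at risk level \alpha (Rockafellar--Uryasev variational form);
% when there is no probability atom at the VaR it equals E[ Z | Z <= VaR_\alpha(Z) ]
\mathrm{CVaR}_\alpha(Z) = \sup_{c \in \mathbb{R}}
  \Big\{\, c - \tfrac{1}{\alpha}\, \mathbb{E}\big[ (c - Z)_+ \big] \,\Big\}
```

under such a definition a risk-sensitive td target would bootstrap through the cvar of the next-state return distribution rather than through its mean, which is the concern behind the reviewers' question about whether the vanilla td loss of eqn 4 and the usual contraction argument carry over unchanged.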
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposes audiovisual hubert based on the original hubert model the general idea is similar to hubert that predicts clusters of extracted features the authors further propose the modality dropout and masking by substitution module to further finetune the design extensive experiments and ablation studies are carried out on the lrs3 dataset which validate the effectiveness of the proposed method strength this paper is basically wellwritten with detailed supplementary the idea of leveraging bert for selfsupervised training is different from the traditional audiovisual alignment pertaining detailed improvements such as the audiovisual stream formulation and masking by substitution have been proposed the results on lrsv04 outperform the previous sota weakness this kind of formulation though novel on the topic of audiovisual speech recognition has been used on other topics the whole model is built upon hubert which limits the contribution why use only the lrs3 dataset for evaluation the lrs2 dataset should be available more dataset evaluations could be more comprehensive than one about the masking by substitution why would the fake segment detection subtask improves the learned features of the resnet encoder when the filledin frames are from real videos more intuition should be given it is said that your motivation is to keep masked region smooth temporally i understand that it would be smoother than random noise masking however segments from the same video cannot guarantee temporal smoothing i am curious about the multilingual vs monolingual experiments in sec 44 the model is kept in a setting with one 1 iteration training and 30h of labeled data while i understand the limitation in resources the authors are suggested to show the multilingual results under the full setting with extra data involved it might take a longer training time after all the method is agnostic to the spoken language overall this paper introduces techniques that the field of audiovisual speech recognition has rarely leveraged into this field and makes taskspecific modifications the idea is not thoroughly new but the application is good as i am no expert in this field it is possible that i have missed something docsepthis paper presents a multimodal audiovisual pretraining approach that results in models that can be fine tuned to achieve state of the art performance on both lipreading visual only as well as speech recognition audio only on the lrs3 dataset which is the largest public lipreading benchmark it extends the idea of masked selfsupervised learning to multimodal scenarios where units that are masked and predicted are automatically discovered via clustering and refined as training progresses while the proposed approach is a relatively straight forward extension of previously published hubert model it is demonstrated to be very effective paper also presents a number of alternative approaches for leaning embeddings of the visual modality and very interestingly demonstrates the value of multimodal input to the model even when a single modality is present at test time pros a simple but very effective extension of masked selfsupervised training to multimodal tasks b state of the art lipreading and asr results on the lrs3 benchmark c demonstrates value of providing multimodal input to the model even when only a single modality is present at test time alternatives that consider only that modality as 
input image of lips in this case are significantly worse cons not a negative but it will be interesting to see further experimentation with layerwise supervision approaches that inform visual representation learning via audio or av networks this will have the advantage that the model will only see the visual modality at training time yet will have the advantage of supervision from audio or av model at all layers not just at the output very well written paper presenting an effective approach for learning embeddings for multimodal data docsepthe paper proposes a strategy for selfsupervised pretraining for lip reading to this end they propose avhubert which learns embeddings by masking video input and predicting iteratively refined hidden units the authors propose two substrategies for this visualonly and crossmodal the pretrained embeddings are finetuned on the lip reading tasks in the style of wav2vec 20 and hubert for automatic speech recognition the authors perform experiments on both limited and full resource settings on both of which they demonstrate strong performance the ideas are intuitive and very suitable for the problem the authors take good advantage of a recent advancement of pretraining in asr hubert and applies it to lip reading aka visual speech recognition the visualonly hubert is a simple adaptation but the crossmodal iterative refinement is a useful addition the performance is strong on both low and full resource settings and the results clearly demonstrate the advantages of selfsupervised pretraining it would be good to also show the effectiveness of the embeddings for other downstream tasks such as audiovisual speech recognition and speech separation i think table c1 in the appendix is quite useful for justifying design choices and at least some of it would be beneficial to show in the main paper it would also be good to show lip reading results after visualonly singlemodality pretraining like audio hubert without crossmodal training as an application paper the paper might be better suited for cvpr or interspeech but i still vote for acceptance since the contributions are significant and the results are stateoftheart the authors propose an effective adaptation of hubert for lip reading by adding crossmodal iterative training the contributions are logical and the results are strong docsepthis paper presents a new selfsupervised learning framework for audiovisual speech their framework is trained with bertlike training that predicts representations in the masked regions using audio and visual inputs extensive experiments are performed using two popular audiovisual datasets especially with a small amount of labeled training data 30 hours their method achieved comparable results with a previous stateoftheart method which trained with a huge amount of labeled data 31000 hours moreover when trained with more labeled data 433 hours it surpasses the previous stateoftheart method trained with 31000 hours by 31 wer strength 1 the authors propose a powerful method to learn speech representations from multimodal data which can be utilized for both asr and vsr in a selfsupervised manner 2 avhubert achieves stateoftheart performance in terms of wer in a sentencelevel audiovisual dataset lrs3 moreover even in a lowresource setting training with 30 hours labeled data 1759 hours unlabeled data their model shows a comparable vsr result with a previous stateoftheart method that trained with 31000 hours labeled data 3 their proposed method is well verified in various views with extensive 
experiments through both manuscript and supplementary performances 1 with different model sizes 2 with different loss functions ctc and s2s 3 with different cluster targets 4 using multilingual data 5 of both asr and vsr and 6 of ablation studies weakness 1 even the proposed avhubert shows impressive performances but it is an expansion of hubert to work in multimodal data 2 there are many typos and errors in the manuscript please refer to below detailed comments moreover figure 1 should be improved the arrows are arranged confusingly questions 1 the authors use concatenation as the default fusion operator which dimension is utilized for concatenation temporal or channel if the temporal dimension is used for multimodal fusion better to refer to the two papers r1 r2 2 for the loss function in equation 4 their final performance seems obtained by setting the alpha as 0 however it is hard to find because it is placed in the results of ablation studies in the appendix since setting the alpha as zero the same as omitting the second loss term is better performed it would be better to describe the effect of alpha in avhubert after equation 4 or at the experimental setup briefly 3 the asr performance of avhubert is not presented it is just described as the avhubert performs worse than hubert in text moreover the last sentence in sec 45 seems not enough to explain why avhubert is not better than hubert on asr even if multimodal data is utilized during training have the authors reexamined hyperparameters ma mv pm pa alpha for asr task 4 2nd paragraph in sec a4 what does the sentence both modalities are used at test time for feature extraction mean there is no downstream task utilizes both modalities errors 1 sec 32 line 4 in the cluster assignment z1ta does z1ta mean z1ti 2 3rd paragraph on page 4 when only modality is used only one modality 3 last paragraph on page 5 where b consists of all possible where b1 maps all possible b seems a mapping function thus the word consist seems not appropriate 4 2nd paragraph on page 6 we rely on the joint decoder module what does the joint decoder mean do the authors use a joint ctcs2s decoder 5 2nd paragraph on page 8 iterative petraining iterative pretraining 6 1st paragraph on page 14 cmudict cmu maybe missing reference r1 chen yenchun et al uniter universal imagetext representation learning european conference on computer vision springer cham 2020 r2 lee sangho et al parameter efficient multimodal transformers for video representation learning international conference on learning representations 2020 overall authors approach is promising for reducing the need for a large labeled visual dataset for training vsr models and the result shown in the paper is significant ### Summary:
paper this paper introduces an extension of the hubert audioonly model for the audiovisual setting allowing for selfsupervised pretraining of multimodal model which also performs well on the unimodal tasks lipreading and asr the paper applies the idea of modality dropout to their multimodal pretraining setup and introduce the idea of masking with substitution as a way to improve visual representation learning a strong aspect of the paper is its experimental section showing strong improvement for lipreading tasks bringing a new stateoftheart performance the experiments also show improvement for asr task discussion all reviewers seemed to appreciate the experimental results with new stateoftheart performance on both unimodal task when performing multimodal pretraining the paper does bring some technical novelty but primarily because of its application to the audiovisual domain the modality dropout idea was already explored for other audiovisual tasks such as speechdriven face animation abdelaziz et al icmi 2020 but the idea of masking by substitution seems novel and helps learning better visual representations the authors were able to address many questions and concerns expressed by reviewers all reviewers took the time to read these responses and acknowledge them summary this paper brings an interesting extension of the audioonly hubert model for the audiovisual setting the strength of the paper is in its evaluation with strong performances establishing many new stateoftheart results all reviewers supported the acceptance of this paper
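the reviewer question about which dimension the audio and visual streams are concatenated over, and how modality dropout interacts with that fusion, can be made concrete with a minimal sketch; everything below (function name, drop probabilities, channel-wise concatenation) is an illustrative assumption and not the actual av-hubert implementation.

```python
import torch

def modality_dropout(audio_feat, video_feat, p_drop=0.5, p_audio=0.5, training=True):
    """Illustrative modality dropout before audio-visual fusion.

    audio_feat, video_feat: tensors of shape (batch, time, channels), already
    aligned to the same frame rate. With probability p_drop one modality is
    zeroed out for the whole sequence, so the fused encoder cannot rely on a
    single stream; at test time (training=False) both streams pass through
    unchanged, and a missing modality can be fed in as zeros.
    """
    if training and torch.rand(1).item() < p_drop:
        if torch.rand(1).item() < p_audio:
            audio_feat = torch.zeros_like(audio_feat)   # keep video only
        else:
            video_feat = torch.zeros_like(video_feat)   # keep audio only
    # channel-wise concatenation: output shape (batch, time, c_audio + c_video)
    return torch.cat([audio_feat, video_feat], dim=-1)
```

concatenating along the channel dimension (rather than the temporal one) keeps the sequence length fixed, which is one plausible reason the two streams must first be brought to a common frame rate; whether this matches the paper's default fusion operator is exactly what the reviewer asks.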
[ 273, 253, 4081, 1332, 50276, 45563, 50275, 2520, 2929, 310, 10323, 973, 15720, 342, 7000, 24864, 50275, 783, 2934, 273, 19732, 2977, 270, 797, 323, 1881, 35421, 3733, 310, 1027, 432, 253, 5899, 41174, 729, 261, 780, 12420, 27855, 50275, 5992, 7193, 11701, 824, 347, 253, 41174, 729, 261, 780, 5542, 15895, 285, 44790, 407, 19137, 452, 644, 4081, 50275, 783, 1543, 327, 298, 2967, 87, 2125, 562, 32231, 253, 2045, 256, 5503, 50275, 20881, 1255, 50275, 2520, 2238, 273, 15895, 2167, 4460, 327, 253, 9400, 273, 41174, 729, 261, 780, 6519, 8981, 556, 644, 908, 327, 643, 12989, 253, 2644, 1566, 310, 4270, 2220, 14713, 797, 534, 7787, 253, 7680, 50275, 22309, 897, 760, 253, 298, 2967, 20, 10895, 323, 7103, 253, 298, 2967, 19, 10895, 943, 320, 2130, 625, 10895, 27163, 812, 320, 625, 11088, 685, 581, 50275, 10383, 253, 44790, 407, 19137, 2139, 651, 253, 15223, 8223, 5481, 8482, 1945, 19132, 253, 6311, 3386, 273, 253, 501, 3024, 32049, 672, 253, 6898, 249, 13009, 403, 432, 1524, 10556, 625, 30328, 943, 320, 1677, 352, 310, 753, 326, 634, 16038, 310, 281, 1978, 34741, 2919, 6032, 5897, 595, 891, 2096, 326, 352, 651, 320, 39797, 977, 685, 3632, 6046, 44790, 2299, 13288, 432, 253, 1072, 3492, 2550, 12215, 11935, 36971, 50275, 74, 717, 14338, 670, 253, 1554, 39661, 4632, 28294, 272, 780, 4679, 275, 4706, 7127, 253, 1566, 310, 4934, 275, 247, 4758, 342, 581, 337, 19502, 3733, 285, 1884, 73, 273, 13130, 941, 1223, 891, 2096, 253, 12291, 275, 5300, 253, 4477, 403, 5125, 281, 921, 253, 1554, 39661, 1543, 762, 253, 2120, 4758, 342, 4465, 941, 3206, 352, 1537, 1379, 247, 3356, 3733, 673, 846, 512, 253, 1332, 310, 639, 79, 6932, 281, 253, 13452, 3448, 50276, 1189, 455, 436, 2929, 23970, 5609, 326, 253, 1673, 273, 41174, 729, 261, 780, 6519, 8981, 556, 11766, 19732, 2961, 715, 436, 1673, 285, 2789, 8892, 29765, 14586, 253, 2934, 310, 417, 16575, 747, 533, 253, 2898, 310, 1175, 347, 891, 717, 642, 6485, 275, 436, 1673, 352, 310, 1896, 326, 891, 452, 9829, 1633, 5474, 33032, 2520, 2929, 10262, 247, 23390, 26306, 41174, 729, 261, 780, 3215, 26208, 2746, 326, 1543, 275, 3210, 326, 476, 320, 4030, 24251, 281, 5115, 1375, 273, 253, 1445, 3045, 327, 1097, 5541, 24042, 5304, 760, 347, 973, 347, 6519, 8981, 9797, 760, 327, 253, 298, 2967, 20, 10895, 534, 310, 253, 6253, 1345, 5541, 24042, 22791, 352, 8725, 253, 2934, 273, 34741, 1881, 35421, 4715, 281, 23390, 26306, 15216, 835, 5085, 326, 403, 34741, 285, 8131, 403, 8356, 6888, 3066, 17524, 285, 22407, 347, 3733, 42851, 50276, 6050, 253, 4081, 2746, 310, 247, 4942, 4951, 3579, 6880, 273, 3786, 3863, 14713, 797, 1566, 352, 310, 5183, 281, 320, 1077, 3576, 50276, 20790, 671, 10262, 247, 1180, 273, 5795, 7274, 323, 25661, 46234, 273, 253, 5304, 36453, 285, 1077, 4722, 314, 14371, 253, 1318, 273, 23390, 26306, 3280, 281, 253, 1566, 1014, 672, 247, 2014, 36453, 310, 1246, 387, 1071, 673, 50276, 856, 84, 50276, 66, 2969, 533, 1077, 3576, 6880, 273, 34741, 1881, 35421, 3733, 281, 23390, 26306, 8892, 50276, 67, 1375, 273, 253, 1445, 5541, 24042, 285, 347, 83, 1543, 327, 253, 298, 2967, 20, 22791, 50276, 68, 14371, 1318, 273, 5277, 23390, 26306, 3280, 281, 253, 1566, 1014, 672, 760, 247, 2014, 36453, 310, 1246, 387, 1071, 673, 50276, 30991, 3993, 326, 1908, 760, 326, 36453, 347, 3280, 2460, 273, 11233, 275, 436, 1083, 403, 3012, 7197, 50276, 5040, 50276, 1439, 247, 4016, 533, 352, 588, 320, 4722, 281, 923, 2007, 40290, 342, 3828, 3020, 20446, 7274, 326, 4151, 5304, 6779, 4715, 3066, 9797, 390, 1323, 6928, 50276, 2520, 588, 452, 253, 5750, 326, 253, 1566, 588, 760, 923, 253, 5304, 
36453, 387, 3733, 673, 2568, 588, 452, 253, 5750, 273, 20446, 432, 9797, 390, 1323, 1566, 387, 512, 8090, 417, 816, 387, 253, 3453, 50275, 635, 973, 3542, 2929, 15250, 271, 3576, 2746, 323, 4715, 46234, 323, 23390, 26306, 941, 5474, 339, 431, 248, 2929, 29328, 247, 5700, 323, 1881, 35421, 3215, 26208, 323, 5541, 4361, 281, 436, 990, 597, 12661, 1323, 73, 34762, 534, 33772, 46234, 407, 44790, 3492, 3280, 285, 21565, 10040, 3146, 22407, 8763, 5085, 253, 4477, 12661, 767, 5390, 992, 447, 323, 436, 5304, 7483, 285, 2831, 24353, 253, 3215, 11273, 46234, 403, 1442, 292, 37437, 327, 253, 5541, 4361, 8892, 275, 253, 3740, 273, 259, 580, 19, 4642, 1384, 285, 14713, 797, 323, 12077, 6519, 8981, 253, 4477, 1347, 4679, 327, 1097, 3710, 285, 2120, 7741, 7533, 327, 1097, 273, 534, 597, 7568, 2266, 3045, 50276, 783, 5697, 403, 27350, 285, 1077, 7470, 323, 253, 1895, 253, 4477, 1379, 1175, 5750, 273, 247, 3332, 32992, 273, 3215, 26208, 275, 347, 83, 14713, 797, 285, 10384, 352, 281, 5541, 4361, 38857, 5304, 6519, 8981, 253, 5304, 7483, 14713, 797, 310, 247, 2969, 15644, 533, 253, 2831, 24353, 34560, 29646, 310, 247, 4217, 1635, 50275, 783, 3045, 310, 2266, 327, 1097, 1698, 285, 2120, 7741, 7533, 285, 253, 1543, 4518, 7568, 253, 11361, 273, 1881, 35421, 3215, 26208, 50276, 262, 651, 320, 1175, 281, 671, 921, 253, 12510, 273, 253, 46234, 323, 643, 15450, 8892, 824, 347, 41174, 729, 261, 780, 6519, 8981, 285, 6519, 9712, 50275, 74, 1158, 2829, 260, 18, 275, 253, 30762, 310, 3240, 4217, 323, 816, 5411, 2216, 10165, 285, 387, 1878, 690, 273, 352, 651, 320, 12912, 281, 921, 275, 253, 2022, 2929, 352, 651, 671, 320, 1175, 281, 921, 5541, 4361, 1543, 846, 5304, 7483, 2014, 2307, 1319, 3215, 26208, 751, 9797, 14713, 797, 1293, 2831, 24353, 3733, 50276, 284, 271, 2898, 2929, 253, 2929, 1537, 320, 1805, 18960, 323, 30105, 1087, 390, 734, 48460, 533, 891, 1335, 6273, 323, 14924, 1580, 253, 9021, 403, 1534, 285, 253, 1543, 403, 1375, 23037, 14387, 50276, 783, 4477, 12661, 271, 3576, 15644, 273, 14713, 797, 323, 5541, 4361, 407, 6240, 2831, 24353, 34560, 3733, 253, 9021, 403, 13760, 285, 253, 1543, 403, 2266, 5474, 33032, 2520, 2929, 10262, 247, 747, 1881, 35421, 4715, 7792, 323, 41174, 729, 261, 780, 6519, 616, 7792, 310, 10166, 342, 270, 797, 3022, 3733, 326, 26295, 14237, 275, 253, 34741, 4811, 970, 9797, 285, 5304, 14800, 9470, 4679, 403, 2684, 970, 767, 4633, 41174, 729, 261, 780, 15302, 3340, 342, 247, 1355, 2408, 273, 13130, 3733, 941, 1884, 3038, 616, 1332, 6786, 10870, 1543, 342, 247, 2045, 1375, 23037, 14387, 1332, 534, 10166, 342, 247, 5699, 2408, 273, 13130, 941, 4562, 933, 3038, 25761, 672, 10166, 342, 625, 13130, 941, 38935, 3038, 352, 28842, 265, 253, 2045, 1375, 23037, 14387, 1332, 10166, 342, 4562, 933, 3038, 407, 4562, 16640, 4757, 337, 186, 783, 4477, 12661, 247, 6422, 1332, 281, 3037, 6519, 14237, 432, 23390, 26306, 941, 534, 476, 320, 12845, 323, 1097, 347, 83, 285, 4632, 83, 275, 247, 1881, 35421, 5133, 374, 186, 580, 73, 34762, 33526, 1375, 23037, 14387, 3045, 275, 2426, 273, 16640, 275, 247, 6197, 5251, 41174, 729, 261, 780, 10895, 298, 2967, 20, 25761, 1014, 275, 247, 1698, 15024, 4758, 3733, 342, 1884, 3038, 13130, 941, 50276, 1166, 3046, 3038, 440, 22027, 941, 616, 1566, 2722, 247, 10870, 4632, 83, 906, 342, 247, 2045, 1375, 23037, 14387, 1332, 326, 10166, 342, 4562, 933, 3038, 13130, 941, 495, 186, 14094, 4081, 1332, 310, 973, 16058, 275, 2710, 6849, 342, 9470, 4679, 949, 1097, 7714, 285, 24864, 16226, 337, 342, 1027, 1566, 9552, 374, 342, 1027, 2957, 3470, 260, 18038, 285, 256, 19, 84, 495, 
342, 1027, 7368, 8571, 577, 970, 1554, 39661, 941, 608, 273, 1097, 347, 83, 285, 4632, 83, 285, 721, 273, 28913, 2175, 50276, 20881, 1255, 337, 186, 9154, 253, 4081, 1323, 73, 34762, 2722, 13943, 16226, 533, 352, 310, 271, 7466, 273, 14713, 797, 281, 789, 275, 23390, 26306, 941, 374, 186, 9088, 403, 1142, 963, 993, 285, 6332, 275, 253, 7714, 4496, 3730, 281, 2708, 7000, 5701, 25761, 4677, 337, 943, 320, 5520, 253, 18159, 403, 10912, 21643, 314, 50276, 34974, 337, 253, 4477, 897, 32147, 318, 347, 253, 4284, 11781, 5572, 534, 7877, 310, 12845, 323, 32147, 318, 11935, 390, 5048, 604, 253, 11935, 7877, 310, 908, 323, 23390, 26306, 11781, 1805, 281, 3730, 281, 253, 767, 9380, 391, 18, 391, 19, 374, 323, 253, 2957, 1159, 275, 5150, 577, 616, 2457, 3045, 3133, 2797, 407, 4758, 253, 9765, 347, 470, 2299, 352, 310, 1892, 281, 1089, 984, 352, 310, 4845, 275, 253, 1543, 273, 28913, 2175, 275, 253, 30762, 1580, 4758, 253, 9765, 347, 5058, 253, 1072, 347, 7005, 2835, 253, 1273, 2957, 1307, 310, 1805, 2684, 352, 651, 320, 1805, 281, 6266, 253, 1055, 273, 9765, 275, 1323, 73, 34762, 846, 5150, 577, 390, 387, 253, 5661, 9978, 13366, 495, 253, 347, 83, 3045, 273, 1323, 73, 34762, 310, 417, 3559, 352, 310, 816, 2529, 347, 253, 1323, 73, 34762, 17923, 7197, 685, 14713, 797, 275, 2505, 25761, 253, 1390, 6197, 275, 4706, 5329, 3133, 417, 2217, 281, 5513, 2139, 1323, 73, 34762, 310, 417, 1805, 685, 14713, 797, 327, 347, 83, 1014, 604, 23390, 26306, 941, 310, 12845, 1309, 3733, 452, 253, 4477, 294, 911, 33431, 4373, 22041, 6429, 278, 87, 12920, 1349, 9765, 323, 347, 83, 4836, 577, 374, 2109, 12494, 275, 4706, 247, 21, 752, 1057, 253, 6197, 1097, 33433, 403, 908, 387, 1071, 673, 323, 4735, 11998, 1599, 627, 310, 642, 15450, 4836, 29820, 1097, 33433, 50276, 22836, 337, 4706, 4567, 1386, 577, 275, 253, 7368, 12714, 1182, 18, 893, 1057, 1182, 18, 893, 1599, 1182, 18, 6811, 50276, 19, 495, 5784, 12494, 327, 3239, 577, 672, 760, 36453, 310, 908, 50275, 7483, 581, 36453, 495, 1390, 12494, 327, 3239, 608, 835, 270, 8414, 273, 512, 1896, 50276, 2811, 270, 18, 8115, 512, 1896, 270, 3133, 247, 10603, 1159, 3021, 253, 3159, 2882, 3133, 417, 4569, 577, 374, 2109, 12494, 327, 3239, 721, 359, 10725, 327, 253, 6036, 29810, 6333, 752, 1057, 253, 6036, 29810, 1599, 513, 253, 4477, 897, 247, 6036, 45830, 6113, 19, 84, 29810, 608, 374, 2109, 12494, 327, 3239, 854, 34560, 7590, 26208, 50276, 2562, 800, 3215, 26208, 721, 337, 296, 12494, 327, 3239, 1638, 7892, 438, 882, 260, 1906, 50276, 28489, 5816, 3806, 50276, 83, 18, 260, 864, 340, 15152, 328, 1162, 355, 440, 2562, 10898, 4440, 292, 2068, 6779, 4715, 19454, 266, 8059, 327, 4382, 8113, 7203, 254, 45909, 9169, 391, 19, 458, 70, 21758, 1689, 1162, 355, 4764, 5919, 23390, 26306, 4979, 398, 323, 3492, 6779, 4715, 5213, 8059, 327, 4715, 14237, 9169, 4583, 4477, 2746, 310, 12532, 323, 8493, 253, 878, 323, 247, 1781, 13130, 5304, 10895, 323, 3733, 4632, 83, 3210, 285, 253, 906, 2011, 275, 253, 2929, 310, 1534, 2490, 187, 4118, 18435, 27, 20790, 436, 2929, 23970, 271, 6880, 273, 253, 14713, 797, 9797, 7483, 1566, 323, 253, 41174, 729, 261, 780, 4758, 6941, 323, 1881, 35421, 3215, 26208, 273, 23390, 26306, 1566, 534, 671, 17923, 973, 327, 253, 32505, 26306, 8892, 5541, 24042, 285, 347, 83, 253, 2929, 10384, 253, 2934, 273, 36453, 5926, 483, 281, 616, 23390, 26306, 3215, 26208, 9978, 285, 9569, 253, 2934, 273, 44790, 342, 19137, 347, 247, 1039, 281, 3157, 5304, 6779, 4715, 247, 2266, 4809, 273, 253, 2929, 310, 697, 5661, 2593, 4645, 2266, 7756, 323, 5541, 24042, 8892, 9745, 247, 747, 1375, 
23037, 14387, 3045, 253, 4679, 671, 921, 7756, 323, 347, 83, 4836, 50276, 49794, 512, 30628, 4455, 281, 11435, 253, 5661, 1543, 342, 747, 1375, 23037, 14387, 3045, 327, 1097, 32505, 26306, 4836, 672, 9591, 23390, 26306, 3215, 26208, 253, 2929, 1057, 3324, 690, 7681, 38135, 533, 8558, 984, 273, 697, 2898, 281, 253, 41174, 729, 261, 780, 5028, 253, 36453, 5926, 483, 2934, 369, 2168, 14859, 323, 643, 41174, 729, 261, 780, 8892, 824, 347, 6519, 17477, 2454, 16904, 490, 7555, 1370, 478, 1162, 355, 17857, 7373, 9169, 533, 253, 2934, 273, 44790, 407, 19137, 3133, 4460, 285, 7729, 4715, 1805, 5304, 14237, 253, 4477, 497, 2104, 281, 2953, 1142, 3533, 285, 7350, 4469, 407, 30628, 512, 30628, 2335, 253, 673, 281, 1239, 841, 6128, 285, 14409, 731, 6010, 436, 2929, 10316, 271, 4722, 6880, 273, 253, 9797, 7483, 14713, 797, 1566, 323, 253, 41174, 729, 261, 780, 4758, 253, 4757, 273, 253, 2929, 310, 275, 697, 7103, 342, 2266, 16226, 14631, 1142, 747, 1375, 23037, 14387, 1543, 50276, 455, 30628, 4516, 253, 14924, 273, 436, 2929 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ labels: token id sequence ]
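The bracketed numeric rows in this dump are the tokenized counterparts of each example's Input and Output text: input_ids appears to hold the token ids of the concatenated prompt, review, and summary, attention_mask is all 1s because nothing is padded, and labels mirrors input_ids in the style of causal language-model fine-tuning. The sketch below shows how such a row could be built; the tokenizer checkpoint, the maximum length, and the labels-copy convention are assumptions for illustration, not values read from this dataset.

```python
# Minimal sketch of how one row of this dataset could be constructed.
# The checkpoint and max_length are assumed, not taken from the dataset itself.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed tokenizer

# Prompt template copied verbatim from the rows of this dump (typos included).
PROMPT = ("Below is given review of a research paper from cnoference journal. "
          "Please write a summary the review. ### Review: ")

def build_row(review_text: str, summary_text: str, max_length: int = 2048) -> dict:
    full_text = PROMPT + review_text + " ### Summary: " + summary_text
    enc = tokenizer(full_text, truncation=True, max_length=max_length)
    return {
        "Input": PROMPT + review_text + " ### Summary:",   # text the model is prompted with
        "Output": summary_text,                             # target summary text
        "input_ids": enc["input_ids"],                      # token ids of prompt + review + summary
        "attention_mask": enc["attention_mask"],            # all 1s when nothing is padded
        "labels": list(enc["input_ids"]),                   # assumed: labels simply mirror input_ids
    }
```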
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposes a novel fewshot classification algorithm based on distribution calibration the proposed work is mainly based on freelunch yang et al 2021 where a feature extractor is first trained using base classes and the features of fewshot classes are augmented by the additional features drawn from gaussian distributions here the mean and covariance of the gaussians are constructed by adequately adapting the statistics computed from the base classes and yang et al 2021 proposed a simple heuristic approach for that this paper suggests using a more theoretically grounded approach for this procedure where the transfer of statistics from base classes is designed to be more sensitive to the importance of individual samples included in base classes or novel classes specifically the paper introduces a hierarchical optimal transport algorithm where the relevance between the base and novel classes is estimated via solving hierarchical optimal transport problems and using the estimated transition matrices for constructing parameters for gaussians to generate novel features strength the paper is wellwritten and easy to follow the hierarchical optimal transport approach seems reasonable and sound the experimental results are extensive and convincing the paper also provides an ablation study and helpful visualizations weakness the algorithm can be considered somewhat incremental because most of the procedure is based on yang et al 2021 as i mentioned before most of the algorithm is based on freelunch yang et al 2021 and the only different part is the construction of gaussian parameters through advanced weighted sums where the weights are computed from hierarchical optimal transport still i think the contribution is quite valuable and the algorithm demonstrated excellent performance docsepthis paper presents a calibrationbased fewshot learning method to address the problem of the biased model toward novel samples the main tool to that end is optimal transport ot and thereby this work attempts to overcome the limitations of its preliminary work 14 by replacing euclidean distance with a proposed distance based on ot that can measure the distance between two distributions in a way of obtaining matching cost of one distribution to another in order to get such a matching cost matrix the paper suggests to learn those costs between base classes to novel samples by a lowlevel ot optimization problem where the goal is to capture the importance of a sample with respect to its class the experimental results show that the proposed method performs better than the existing fewshot learning methods including freelunch 14 strengths 1 14 is a seminal work with clear novelty and this work adequately finds room for improvement and indeed improves the performance of 14 by designing a more statistically sophisticated distance metric 2 experiments are well designed and conducted and the proposed hot method seems to work well in most of the settings 3 in particular hot is highly effective in crossdomain settings compared to freelunch weaknesses 1 there is no experiment to support the limitations of freelunch as explicitly claimed by the authors even though it is observed that the overall performance has been improved it is not quite clear whether such a performance gain indeed comes from addressing the limitations of freelunch 2 the presentation can be further improved as the current notations are too 
complicated and some consecutive sentences are not smoothly linked to each other 3 in the overall performance comparison table 1 the performance gain of hot is somewhat marginal particularly in 1shot settings yes docsepthis work studies fewshot learning and proposes a optimaltransportbased weighting scheme to leverage examples from base classes for updating the classifiers of the learning of novel classes built on the framework of previous work 14 this work develops a optimaltransportbased weighting scheme instead of euclidean distance and the resulting weights per base class implicitly considers similarity of individual examples within that base class to examples in the novel classes evaluation shows improved performance over 14 and competitive performance on standard fewshot image classification benchmarks across multiple backbones extensive ablation studies are conducted to verify superiority to 14 strength 1 extensive experiments across multiple backbones crossdomain results and ablation studies qualitative results on optimal transport weights helps understanding of the approach 2 code provided for reproducing experiment results weakness 1 the improvement over 14 in performance is relatively small with respect to how principled l59 l191 the approach is it would be nice to get an understanding on how much room of improvement is available from distribution calibration to put the improvements that has been extracted by the proposed approach into perspective for example if theres oracle base class weighting how good the performance can be on the query set are we moving in the right direction for example do the class weights eg appendix figure 4 match the oracle weights from 14 to this paper are weights extracted looking more like oracle weights 2 would be nice to demonstrate how the perexample weights mjn catches differences among examples in the same class for example baby is similar to sofa does mjn tell apart sofa images with a baby and sofa images without a baby how do the transport probabilities generalize in crossdomain settings the authors response has adequately addressed my concerns i have updated the score limitations are sufficiently addressed docsepthis paper is on the topic of distribution calibration methods for fewshot learning which seek to align the novel class feature distribution to that of the base classes methodological and experimental comparisons are primarily with a prior distribution calibration method free lunch 14 used the average features of the two closest base classes in euclidean space the authors instead propose using optimal transport to compare novel classes with all base classes additionally the cost function to compare a base class with a novel sample is also learned as an optimal transport problem experiments are done on 5way 1shot and 5way 5shot miniimagenet tieredimagenet cub and cifarfs in comparison with free lunch and other popular fewshot methods overall results are impressive establishing a new stateoftheart strengths 1 the proposed usage of optimal transport to align base and novel distributions is very sensible as are the improvements over the previous distribution alignment method free lunch its almost surprising that such an approach hasnt been attempted before 2 background concepts eg fewshot learning optimal transport are introduced well for unfamiliar readers though some of the writing could use some improvement see below and the authors should take care to avoid text that is too similar to prior works see below 3 the empirical results are 
impressive the main results are on 5way 1shot and 5way 5shot miniimagenet tieredimagenet cub and cifarfs which are the common fsl benchmarks in recent works while all comparisons arent on the same backbone the backbones used are listed in table 1 regardless i believe these are the highest accuracies ive seen on these fewshot benchmarks and the improvements over the most relevant baseline are consistent additional experiments in crossdomain settings and ablation studies lend further support to the efficacy of the method weaknesses 1 there isnt a discussion on computational cost of this method optimal transport has a reputation for being expensive so such a discussion would be very helpful particularly given the twolevel ot used here how long does this method take to compute the transport plan 2 similar to the above point the computational cost of the optimal transport plan scales with the number of classes involved while this may be manageable for the small size of common fsl benchmarks i suspect that datasets with larger numbers of classes may prove more expensive 3 this work seems heavily based on the free lunch paper 14 not just in methodology but also the paper itself some similarities are perhaps natural as this work takes a similar strategy and builds off of 14 however section 22 is mostly a paraphrase of 14s section 31 there are strong similarities in certain lines of the related works and several of the tables have the same formatting there are also some odd variable substitutions from those of 14 that seem unnecessary eg topw instead of the more common topk beta for tukeys ladder of powers transformation instead of the more common lambda 4 writing while generally understandable the writing isnt always idiomatically or grammatically correct i strongly recommend another thorough round of edits a nonexhaustive list of examples appear under miscellaneous below miscellaneous line 21 1 is a strange choice as the only cited reference for image classification line 30 introduce to calibrate line 38 although with reasonably good performance line 51 and elsewhere labelled vs labeled both spellings are correct but generally one should stick to a single version line 75 owing a rich theory due to its rich theory though the rich theory isnt the reason why people use ot for comparing probability distributions line 87 we divide whole dataset missing the line 8998 b is being used to denote both the number of base classes and the set of base classes line 99 to generate from to be generated from line 167 while i see why using only the mean may be inaccurate i fail to see why itd be severely biased line 175 each distributions line 176 should subject to line 214 incorrect usage of besides algorithm 1 t should be in math mode test the query set on classifier theta test classifier theta on the query set line 238 269 missing oxford comma datasets shouldnt be preceded with the word the unless followed by the word dataset table 1 why arent the best results on cub bolded line 269 cifarfas line 341 starting a sentence with and 2 and 5 in the references are the same paper discussion of limitations absent what is the added computation cost of the method when does ot tend to fail how limiting is the gaussian assumption of features ### Summary:
this paper builds off of the distribution calibration approach for fewshot learning known as free lunch replacing the euclidean metric with hierarchical optimal transport the result is a more principled and empirically effective approach the main concerns among the reviewers were the limitations of free lunch and whether this new approach could overcome them and what sort of additional computational cost would be incurred by using optimal transport while more expensive it is felt that overall the approach is sufficiently novel and effective
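The reviews and meta-review above describe distribution calibration: features for a novel class are drawn from a Gaussian whose mean and covariance are transferred from base classes, with the free lunch baseline picking the nearest base classes in Euclidean space and the proposed method weighting all base classes through optimal transport. The sketch below illustrates that calibration step; the Tukey exponent, the entropic regularization, the uniform marginals, the 0.5 mixing of prototype and transferred mean, and the array shapes are illustrative assumptions, not the paper's exact recipe.

```python
import numpy as np

def tukey(x, beta=0.5):
    # Tukey's ladder-of-powers transform, used to make features more Gaussian-like.
    # Assumes non-negative features (e.g. post-ReLU activations).
    return np.power(x, beta) if beta != 0 else np.log(x)

def sinkhorn_plan(cost, a, b, reg=0.1, n_iter=200):
    # Plain entropic-OT (Sinkhorn) iterations between marginals a and b.
    K = np.exp(-cost / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u + 1e-12)
        u = a / (K @ v + 1e-12)
    return u[:, None] * K * v[None, :]

def calibrate_and_sample(support, base_means, base_covs, alpha=0.21, n_samples=500):
    # support:    (k, d)    the few labelled features of one novel class
    # base_means: (B, d)    per-base-class feature means
    # base_covs:  (B, d, d) per-base-class feature covariances
    support = tukey(support)
    proto = support.mean(axis=0)                                  # novel-class prototype
    cost = np.linalg.norm(base_means - proto, axis=1)[None, :]    # (1, B) ground cost
    a = np.ones(1)                                                # novel-class marginal
    b = np.full(len(base_means), 1.0 / len(base_means))           # uniform base marginal
    w = sinkhorn_plan(cost, a, b)[0]
    w = w / w.sum()                                               # transfer weights over base classes
    mean = 0.5 * (w @ base_means + proto)                         # calibrated mean
    cov = np.tensordot(w, base_covs, axes=1) + alpha              # calibrated covariance
    return np.random.multivariate_normal(mean, cov, size=n_samples)
```

In the usage pattern the reviews attribute to this line of work, the generated samples are pooled with the few labelled support features to train a simple classifier such as logistic regression.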
[ input_ids: token id sequence ]
[ attention_mask: all 1s ]
[ labels: token id sequence ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper studies the hierarchical topic modeling problem existing studies generally learn topic representations in the euclidean space which might lead to some limitations in capturing hierarchical relations to be more specific the ability to model complex patterns is inherently bounded by the dimensionality of the embedding space on the other hand side information such as taxonomy of concepts is sometimes available to guide the learning of hierarchical topics which might be challenging to preserve in the euclidean space as a consequence this paper proposes a novel framework that introduces learning word and topic representations in the hyperbolic space experiments on three public text datasets and auxiliary ablationcase studies demonstrate the effectiveness of the proposed framework strengths 1 this paper studies an important task hierarchical topic modeling is a wellstudied yet important problem that has the potential of benefiting a wide spectrum of downstream applications such as classification named entity recognition etc 2 the motivation of the solution is very clear as is well known euclidean space falls short when it comes to modeling hierarchical structures in contrast hyperbolic space has nice properties of hierarchical awareness and spacious room which benefits hierarchical relationship modeling 3 the experiments are thorough they are sufficient to justify the superiority of hyperbolic space under the problem set an ablation study between hyperminer and hyperminerkg validates the modeling of hierarchical external knowledge meanwhile the experiments also show that the proposed method is extendible to all kinds of etm models 4 the paper is carefully written and well organized weaknesses 1 more related works could be potentially included and discussed 1 1 hierarchical topic mining via joint spherical tree and text embedding minor issues typos in line 137 sine should be since in appendix line 93 wth should be with as mentioned in the appendix the main limitation of this work is the mismatch problem between the given structural knowledge and the target corpus to provide proper guidance for mining an interpretable topic taxonomy the prior structural knowledge should be well matched with the corresponding dataset docsepthis paper improves embedded topic modeling by using a hyperbolic space the proposed approach particularly helps considering hierarchical structure of words and topics in addition the authors propose a revised contrastive loss to inject prior knowledge the evaluation compares the proposed approach with 6 existing embedding and nonneuralembeddingbased topic modeling approaches on 4 datasets the proposed approach is shown to outperform the existing approaches in most configurations some analyses including the 2d visualization show the proposed approach performs as intended and encode the hierarchical information strengths s1 the proposed approach encodes hierarchical information and can incorporate the existing knowledge by adopting the hyperbolic space into an existing neural topic modeling approach s2 the paper has good presentation and readability in general s3 the solid empirical results show the performance of the constructed topic model and the analysis shows that the approach works as intended weaknesses w1 the variants hyperetm hyperminer are hyperminerkg are not explained and it thus reduces the reproducibility of the experiments w2 the topic 
taxonomy figure 4b is hard to relate without document examples andor summary of the topics w3 discussion of sideeffect is missing as the hyperbolic space is enforced the freedom of representing a topic decreases as such there can be sideeffects and a discussion of desired properties in topic modeling and sideeffects from this approach would be beneficial for example is the treestructure the best or can it be a rather limiting factor i cannot spot any negative societal impact of this work docsepthis paper presents a neural hierarchical topic model that uses hyperbolic embeddings the paper uses a gammapoisson factorization to facilitate optimizing a variational objective they augment this objective with a taxonomyaware contrastive loss as a way of addressing hyperbolic manifold learning across four datasets they compare against six baselines averaged across multiple runs they demonstrate small but consistent improvements in their metrics all variances are fairly small update postresponse i thank the authors for their answers to my questions i understand theres a limited amount that can be done during the author response that said while the additional taxonomy experiments are encouraging though for inclusion in a paper they would need additional discussion for whatever revision is next i suggest that while the proposed clarifications of poincare vs lorentz may be helpful for some readers the main question is is this level of detail necessary for the main paper if the answer is yes then this point needs to be made more immediately obvious overall this paper has a clear mathematical overview for the singular main contribution hyperbolic hierarchical topic modeling this paper presents a decent amount of experimental evidence that the proposed approach is effective on purely quantitative measures while this paper does not address the impact of the proposed approach on the topic model as a language model perplexity the classification results give some evidence as to the effect on document modeling however lack of examples and experimentation when it comes to the taxonomy is noticable this is especially the case given the tied nature of the model to the taxonomy l167169 this is important not just for overall completeness but to better understand the results that have been presented the gains eg in figure 3 are generally small but consistent that alone is not a negative however those smaller gains are less grounded without additional examples or experimentation this paper has some reorganization and other presentation issues while these issues could arguably be viewed as easily addressable they do impact the ability of a reader to udnerstand what the papers takeaways are the current organization of the paper suggests that the authors apply hyperbolic embeddings to a poisson gamma belief network only however as the results show this is not the case i recommend reorganizing so that the main aspect of hyperbolic embeddings applied to hierarchical topic models stands out since figure 3 only makes distinctions based on color it is a bit difficult to quickly interpret the mathematical introduction of both the lorentz and poincare similarities is confusing and given the amount of space dedicated to the equivalence explanation l140 too nuanced indeed even from the appendix this equivalence is not clear am i missing something obvious while fig 4b is helpful 4a is too busy the concept hierarchy is clear but thats a given while the lexical hierarchy except at a very coarse granularity is not no see taxonomy 
experiments q1 ### Summary:
hyperbolic embeddings were a fascinating alternative to euclidean embeddings that never seemed to take off despite having significant conceptual advantages in representing the oddities of semantics i am happy to see more work on curved spaces as a tool for semantic analysis this work has strong reviews and reviewers were generally happy with the author responses id like to see it published
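For readers who hit the same confusion as the third review above about the equivalence of the lorentz and poincare similarities: the relation between the two models of hyperbolic space is standard textbook background, sketched below as a reading aid. It is an assumption about the background the paper relies on, not a reconstruction of the paper's own derivation.

```latex
% Standard background, not taken from the reviewed paper.
% Lorentz (hyperboloid) model: inner product and distance.
\langle x, y \rangle_{\mathcal{L}} = -x_0 y_0 + \sum_{i=1}^{n} x_i y_i ,
\qquad
d_{\mathcal{L}}(x, y) = \operatorname{arcosh}\!\bigl(-\langle x, y \rangle_{\mathcal{L}}\bigr) .
% The map to the Poincare ball is an isometry, so the two similarities induce
% the same distances:
\pi(x_0, x_1, \dots, x_n) = \frac{(x_1, \dots, x_n)}{1 + x_0} ,
\qquad
d_{\mathbb{B}}\bigl(\pi(x), \pi(y)\bigr) = d_{\mathcal{L}}(x, y) .
```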
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the paper compiles a dataset from various sources into one unified format for use in downstream research this dataset can be used for pretraining since it is not labelled for any specific purpose and can provide useful insights into various social issues and datarelated insights 1 it is a really large dataset 256 gb using this as pretraining data would provide useful insights to the community and the authors built and trained such a model 2 the paper also shows some case studies related to the use of this dataset this will be especially useful for researchers who are not familiar with the insights that legal documents can provide for example studying anonymization or toxicity 1 the dataset can be used in the future for pretraining but it appears it is across a very wide legal spectrum is this the norm in the legal domain in healthcare such a wide use of datasets will not lead to useful insights for example ehr records alone can vary based on format structure and term usage and it is rarely possible to use them with a huge data dump 2 this dataset is rather large and can explicitly violate the privacy concerns of those mentioned i am not convinced with everything in the paper that the authors have taken appropriate measures in anonymizing names and identifiable information and if such a task is even possible 2 a public dataset that is available over the hugging face api can violate individual privacy if the documents arent anonymized properly or can be deanonymized easily i understand that this was all public data and that it was available for download before with permissive licenses but now this data is callable with a single loaddataset call which is slightly disconcerting docsepthe paper presents a dataset of opensource englishlanguage legal and administrative data covering court opinions contracts administrative rules and legislative records a new dataset has been presented opensource dataset without license restrictions description of the privacy concerns a detailed description of data acquisition and data sources unfortunately it contains mainly uscanada legal domain docsepprivacy and toxic content is a problem for training material in large language models the paper investigates the issue by curating and using a large open source legal dataset where the data has already been filtered through legal and administrative processes as the dataset comes from different sources 35 in different contexts eg criminal vs civil cases and in different countries it has been filtered using different rules as they apply in each particular case this variegated dataset allows the authors to carry out a series of experiments leveraging the differences in the already applied filtering rules how do different jurisdictions handle privacy filtering how is toxic content handled by governments also the dataset can be used so that models can be trained in order to learn contextual privacy rules like pseudonymisation investigate how consistent toxicity filters are they arent the dataset may be useful to other researchers particularly those wishing to investigate the impact in training of differentiated visavis the filtering rules applied of training datasets very large dataset dataset comes from different sources can be used to answer practical questions in training
large language models well documented from a legal point of view material from the eu does come from a different tradition than the caselaw used in us and the uk it is mentioned that the adversarial legal system in many anglophone countries creates incentives for lawyers to complain about overt racism in written materials the difference between casebased and continental or civil law would be interesting to pursue docsepthe paper studies responsible data filtering practice for ai researchers the paper conveys the legally grounded lessons by surveying how governments have developed the data filtering algorithm while balancing transparency vs privacy and toxicity they also provide a large scale comprehensive legal corpus that can be used for training legal language models and show that such models have the potential to be used as contextual data filters by performing case studies overall the paper is well written and the issue is timely the released corpus is comprehensive and will be useful for studying legal ai the released language model polbert does not seem to show the full potential of the pol dataset yet the focus of this paper is raising the issue and conveying the legally grounded lessons with a new research direction using a large scale legal corpus a study of the timely issue in a legally grounded way the release of a comprehensive largescale english legal corpus showing a potential for a datadriven contextual text sanitization the released corpus is mostly unstructured for instance in courtlisteneropinions the meta data like issued date the name of judges the name of court are not separated from the raw text as well as the major fields like facts claim and opinion this may limit the usability and the uniqueness of the released dataset the accompanying polbert language models do not seem to be trained optimally and their usefulness may be limited
the reviews are generally positive though somewhat short and a large pretraining corpus for legal text will likely be useful for nlp research one reviewer gave a reject score 5 with the following two weaknesses a the dataset comes from a wide spectrum of lawrelated data sources which may differ substantially and hence limit the usefulness of the dataset b privacy i am not too worried about point a because large language models seem to be able to learn from diverse data sources regarding point b the separate ethics review mentions the privacy concerns as well but finds that the submission sufficiently discusses this concern and sees no serious ethical issues hence overall i recommend accepting the paper
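The first reviewer's point that the corpus is now callable with a single loaddataset call refers to the hugging face datasets api. A minimal sketch of what such a call looks like is given below; the repository id, config name, and the text field are illustrative assumptions rather than details confirmed by the paper.

```python
# Minimal sketch (assumed identifiers): streaming a large public legal corpus
# through the Hugging Face `datasets` API instead of downloading the ~256 GB dump.
from itertools import islice

from datasets import load_dataset

corpus = load_dataset(
    "pile-of-law/pile-of-law",   # hypothetical repository id
    "courtlistener_opinions",    # hypothetical config name for the opinions subset
    split="train",
    streaming=True,              # iterate lazily rather than downloading everything
)

# Peek at a few documents; the "text" field name is an assumption.
for doc in islice(corpus, 3):
    print(doc.get("text", "")[:200])
```

Streaming access of this kind is exactly why the reviewer finds the release slightly disconcerting: any leftover personal information becomes reachable with a few lines of code.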
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: this paper studies the problem of consistency and calibration in the setting of classification with an adversary that perturbs the inputs during inference the authors highlight the critical difference in the consistency and calibration between the standard setting and the adversarial setting they show the widely used convex surrogate losses for 01 loss are no longer calibrated or consistent they then provide detailed analyses to build the necessary and sufficient conditions for continuous loss functions to be calibrated in an adversarial setting they further advance toward consistency in the adversarial setting by showing a weak result of consistency the originality of the work is sufficient as the studies of calibration and consistency in adversarial settings for general functions are missing in the literature to the best of my knowledge the quality of the work is high as the authors provide detailed and solid analyses of calibration and consistency in the adversarial setting including the difficulty and potential solutions to specific and general results the paper is wellorganized and clarified with sufficient background and preliminaries provided i am not an expert in calibration and consistency but i was able to follow the analyses step by step finally i believe the result is significant as the calibration and consistency in adversarial settings are important problems that are able to guide the training of adversarially robust classifiers this paper is able to move one step toward this goal the limitations of this paper have been adequately discussed at the end of section 42 docsepthe paper shows that the usual continuous loss functions like hinge loss and logistic loss that are used to approximate the 01loss are not consistent nor calibrated for the adversarial 01loss it shows that a certain asymmetry is necessary in order to have calibration and proposes shifted odd functions as candidates that are shown to fulfill them finally difficulties with and potential pathways towards consistent losses for the adversarial setting are discussed and related with some interesting theoretical results the paper does not contain any experiments the paper is generally well written and presented showing that the standard losses for adversarial robustness training do not satisfy certain basic theoretical properties is an interesting and potentially important insight the paper rigorously and comprehensively proves these results i checked the proofs up until proposition 31 and they appear to be correct with the only issue for me being the definition of eta see below figure 1 is very helpful for understanding proposition 31 more similar illustrations for the other results would be very beneficial for the paper if possible i think that the paper is a very interesting contribution to the field of adversarial robustness the paper is generally well written but some details are missing for me also due to me not being very familiar with the literature on consistency i hope for clarifications on the questions below particularly on eta while the results are interesting and partly surprising to me from the perspective of adversarial robustness i cannot say much about the novelty in the context of consistency and calibration literature which this work builds upon the limitations of currently achievable results on consistency are described in detail the authors believe there is no potential social harm
which is a statement that is not obvious for any machine learning research due to potential dual use and other general risks however there is nothing specific that needs to be mentioned for this paper and i agree with the authors as in my opinion a work like this has nonnegative expected social impacts docsepthis paper studies the consistency and calibration properties of adversarial losses specifically it defines an adversarial loss as the highest loss in an epsilonball around data examples and finds the sufficient and necessary conditions when an adversarial loss is adversarially calibrated furthermore this paper also studies the consistency of adversarial loss however the obtained result in prop 42 regarding the consistency seems weaker than expected strengths this is a comprehensive study about the calibration and consistency properties of loss functions in the context of an adversary the paper is generally wellwritten and easy to follow weakness the obtained results seem to have less impact on practical problems of adversarial attack and defense there is no hint of how to utilize the obtained results to improve adversarial robustness this is a theoretically comprehensive study however it would be much better if the authors give hints of how the obtained results can be exploited to benefit adversarial attack and defense
reviewers have expressed strongly in favour of acceptance one improving their score to very strong accept after the rebuttal and discussion congratulations im delighted to recommend acceptance
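As background for the terminology running through this row, the adversarial losses the reviewers discuss are worst-case losses over an epsilon-ball around each input. The notation below is a generic sketch for the binary margin-based setting (y in {-1, +1}); it is standard background, not copied from the reviewed paper.

```latex
% Generic definitions (background only, not from the reviewed paper):
% adversarial 0-1 loss and an adversarial surrogate built from a margin loss phi.
\tilde{\ell}^{\,0\text{-}1}_{\epsilon}(f, x, y)
  = \sup_{\|x' - x\| \le \epsilon} \mathbf{1}\bigl[\, y \, f(x') \le 0 \,\bigr] ,
\qquad
\tilde{\ell}^{\,\phi}_{\epsilon}(f, x, y)
  = \sup_{\|x' - x\| \le \epsilon} \phi\bigl(y \, f(x')\bigr) .
```

Calibration then asks that driving the conditional risk of the phi-surrogate to its infimum also drives the conditional adversarial 0-1 risk to its infimum, which is the property the reviews report fails for the usual hinge and logistic losses.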
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors propose minimummargin mm attack to provide comparable performance with autoattack while significantly decreasing the computational cost they propose sequential target ranking selection stars to make the computational cost independent of the number of classes strengths the paper is wellwritten and the preliminaries are described clearly the proposed method presents significantly low computational complexity weaknesses the proposed method is only compared against pgd and cw the authors have mentioned that for reliability we evaluate the quality of adversarial examples using the margin between two targets for precisely identifying the most adversarial example can you please explain about most adversarial example beta in equation 9 is not defined although i believe the proposed method has the potential for a good publication i do not recommend the acceptance of the paper in the current form docsepthe paper proposed a strong adversarial attack ie an attack that can generate strong adversarial examples and thus can better evaluate the adversarial robustness of given deep learning models compared with the sota attack the proposed attack is much faster and thus easier to be applied in practice the idea is novel and the results are solid major contributions the main idea has been illustrated in figure 2 traditional pgd attack minimizes the probability of the true label by maximizing the loss and the proposed minimum margin attack minimizes the margin of the probabilities between the true label and the most confusing label to the best of my knowledge the idea is novel in adversarial machine learning the computational efficiency of the mm attack is amazing nowadays researchers are still using mainly pgd for training but aa for evaluation because aa is more than 100 times slower than pgd the mm attack is 20 or even 30 times faster than the aa attack making it possible for training with stronger adversarial examples besides faster evaluation of the adversarial robustness of given deep learning models in my opinion the results are significant concerns the paper lacks some theoretical analysis for example how would the attack converge how to guarantee the minimum probability margin example is stronger than the minimum probability example and thus more informative for both evaluating and training and whether the iterative mm attack algorithm is as stable as the pgd and aa algorithms the experiments mainly focused on the evaluation part and then the training part is quite weak although researchers believe stronger adversarial examples lead to more robust models it is not always the case because adversarial examples can be generated by quite different underlying principles as minimum probability vs minimum probability margin it is better to concretely show that mm is almost as fast as pgd and almost as strong as aa for training besides for evaluation this is quite critical for the significance of the paper this is an overall wellexecuted paper with good novelty and solid experiments some points should be clarified and stregnthened in the revision docsepthis paper proposes a minimummargin mm attack to evaluate defenses the authors report detailed results on the effects of different loss functions experiments are done on cifar10100 and svhn against the adversarially trained models strengths this paper is wellwritten especially with detailed descriptions and empirical results on the 
effects of different attacking loss functions the improvements shown in figure 1 seems promising with significant saving on computation weaknesses although several attacking baselines are considered they are all only evaluated against pgdat madry et al 2018 this could cause a biased evaluation of the attacking performance namely as a potential substitute for aa the proposed mm should be widely tested against different defenses just as done in the aa paper croce hein 2020 this should not be computationally hard considering the efficiency of mm and many existing defenses and their checkpoints provided in eg robustbench the multitarget attacking strategy has already been proposed in a but im surprised that a is not even cited in this paper for me the proposed star strategy is just a topk variant for the original multitarget attack besides using logits rather than softmax outputs is also not a new discovery since carlini wagner 2017b thus the technical contribution and novelty of mm are quite limited minors what is the definition of lmm in algorithm 1 references a gowal et al an alternative surrogate loss for pgdbased adversarial testing 2019 limited technical contribution lack of evaluations against more defenses docsepthe paper proposes an attack for testing adversarial robustness that is reportedly faster than the stateoftheart attacks but still produces reliable results the advantage in speed is obtained by using a sequential target ranking selection method while reliability is achieved by using a minimummargin loss comments the threat model is not stated anywhere there is no definition of adversarial robustnessrobust accuracy it is not clear then if 1 the attack is a minimumdistance or maximumconfidence attack 2 it is a targeted or untargeted attack in the common sense used in the field as opposed to the targeted version of apgd that is instead just using the targets for reducing the number of adversarial classes to consider in the optimization and 3 if the attack is only defined in the ellinfty norm flaws in the experimental evaluation the evaluation does not consider a stateoftheart attack such as brendel2020 this is a pity as brendel2020 is presented as a fast and reliable method for evaluating robustness and has similar desired characteristics as this attack the parameters of the attacks used seem suboptimal there is no choice of the hyperparameters and using 10 steps for pgd seems to be limiting the capabilities of the attack the same can be true for the cw attack especially as there is no mention on how many binarysearch steps are being used it is ok to test the attacks with limited resources but a more detailed asymptotic analysis eg with 1k steps would concretely support the claims of the paper that the attack remains comparable to the other attacks while reducing computational time the runtimes are computed in an uneven scenario the total time perstep or better perquery to the model should be used instead of the total cumulative time this makes no sense as an alternative one should compare the capabilities of the pgd attack depicted here as fast but not reliable with a fixed computational time ie by increasing the number of steps performed by pgd until it spends the same amount of time as the mm attack the authors did not state if they used some available implementations of the attack or implemented their own versions since the computational time depends on the implementation this might be a problem when using the runtime as a benchmark incorrect statements and unsupported claims 
abstract there is no definition of the most adversarial example even though there are several references of this in the paper this is also used in the abstract depending on the objective a stong adversarial example can be seen in different ways i suggest to expand this with a definition there is no evidence suggesting that the pgd attack is 100 times slower than aa the comparison is performed in uneven scenario where aa uses 100 iterations while pgd uses 10 moreover this is stated in the abstract which makes the statement easy to take and quote without knowing the context this statement should be removed introduction for practitioners who need realtime evaluation at each epoch of the training process of a robust model is there real cases that require this kind of evaluation this is missing a reference unfortunately pgd fails to reliably evaluate adversarial robustness of a robust dnn this sentence is overgeneralistic and not true for the majority of the cases pgd was succesfully used against many defenses just by making it adaptive to the defense tramer2020 ce loss which is based on the probability of the true label py is not an appropriate measure to the quality of adversarial examples there is no definition in the paper for quality of adversarial examples which makes this statement very confusing hence the reliable method is to minimize zy zt for each t neq y and take the most adversarial one which is a widely used solution this is not widelyused as for now it seems only used in croce2020 preliminary x0 refers to the starting point which corresponds to the natural example or the natural example perturbed by a small gaussian or uniformly random noise the statement within parentheses makes the definition of the closed ball in eq 2 makes the ball centered in x0 this does not correspond to the adversarial robustness measured in the original clean sample many equations see eqs 37 depend on f x y but they often dont appear inside the equations they showed that using adaptive step size significantly improves the adversarial robustness should be improves the adversarial examples or improves the adversarial evaluation the attacks are not improving robustness realization eqs 8 and 9 use variables alpha beta never introduced in the text minor issues the comparison with targeteddlr loss in sect 3 should be clarified it is very difficult to read and it does not really capture the advantage of using different methods for rescaling this might be better supported by some evidence or toy example and surely by adding some insight on which the hypothesis is based on moreover the authors should then explain what is the difference from the cw loss as it seems that they are using that one figures and tables need descriptive captions that clarify what is being depicted in particular tables need improvements in the headers and some highlighting of the results it is also a good practice to mention them in the text figure 1b figure 2 is difficult to understand and contains a legend with unclear definitions see classified area in table 2 it is impossible to understand what are the values presented in the cells the algorithm needs some hintscommentsdescription references tramer2020 tramer florian et al on adaptive attacks to adversarial example defenses advances in neural information processing systems 33 2020 croce2020 croce francesco and matthias hein reliable evaluation of adversarial robustness with an ensemble of diverse parameterfree attacks international conference on machine learning pmlr 2020 brendel2020 
brendel w et al accurate reliable and fast robustness evaluation thirtythird conference on neural information processing systems neurips 2019 curran 2020 strengths tries to improve efficiency of adversarial attacks weaknesses the paper is missing some definition that should not be taken for granted the evaluation is not entirely convincing and might be unfair results should be presented better as they are very difficult to read and understand ### Summary:
the paper focuses on the strong adversarial attack ie an attack that can generate strong adversarial examples and thus can better evaluate the adversarial robustness of given deep learning models one review gave a score of 8 while the other 3 reviewers gave negative scores the main issue lies in the limited experiments as a potential substitute for aa the proposed mm should be widely tested against different defenses just as done in the aa paper the writing of the paper is somehow not rigorous including many incorrect statements and unsupported claims which should be well addressed in the revision thus it cannot be accepted to iclr for its current version
[ 50275, 783, 4081, 1332, 310, 760, 2429, 1411, 23256, 69, 285, 260, 88, 50274, 783, 4477, 452, 5393, 326, 323, 13367, 359, 7472, 253, 3290, 273, 48960, 6667, 970, 253, 8459, 875, 767, 8571, 323, 10534, 12488, 253, 954, 48960, 1650, 476, 368, 4496, 5513, 670, 954, 48960, 1650, 50272, 2461, 275, 5150, 898, 310, 417, 2931, 3738, 891, 2868, 253, 4081, 1332, 556, 253, 2442, 323, 247, 1175, 9311, 891, 513, 417, 5583, 253, 14924, 273, 253, 2929, 275, 253, 1655, 830, 5474, 339, 431, 248, 2929, 4081, 247, 2266, 48960, 2983, 26332, 271, 2983, 326, 476, 6635, 2266, 48960, 6667, 285, 3021, 476, 1805, 7472, 253, 48960, 31640, 273, 1677, 3676, 4715, 3210, 2429, 342, 253, 256, 5503, 2983, 253, 4081, 2983, 310, 1199, 7938, 285, 3021, 6927, 281, 320, 3732, 275, 3946, 253, 2934, 310, 4460, 285, 253, 1543, 403, 4891, 50276, 24330, 9021, 50276, 783, 2022, 2934, 556, 644, 12800, 275, 4677, 374, 5899, 23256, 69, 2983, 46926, 253, 5912, 273, 253, 2032, 5203, 407, 46875, 253, 2957, 285, 253, 4081, 5927, 8459, 2983, 46926, 253, 8459, 273, 253, 20552, 875, 253, 2032, 5203, 285, 253, 954, 21643, 5203, 281, 253, 1682, 273, 619, 3640, 253, 2934, 310, 4460, 275, 48960, 5145, 4715, 50276, 783, 15180, 6733, 273, 253, 5823, 2983, 310, 8644, 31735, 8607, 403, 1335, 970, 7194, 23256, 69, 323, 3733, 533, 39951, 323, 7103, 984, 39951, 310, 625, 685, 2233, 2069, 17357, 685, 23256, 69, 253, 5823, 2983, 310, 1384, 390, 1014, 1884, 2069, 7938, 685, 253, 39951, 2983, 2403, 352, 1896, 323, 3733, 342, 10046, 48960, 6667, 16280, 7938, 7103, 273, 253, 48960, 31640, 273, 1677, 3676, 4715, 3210, 275, 619, 4743, 253, 1543, 403, 1534, 50276, 585, 1209, 2224, 50276, 783, 2929, 19756, 690, 10527, 1783, 323, 1650, 849, 651, 253, 2983, 29623, 849, 281, 12215, 253, 5927, 5912, 8459, 1650, 310, 10046, 685, 253, 5927, 5912, 1650, 285, 3021, 625, 27096, 323, 1097, 16344, 285, 3733, 285, 1880, 253, 34560, 5823, 2983, 5933, 310, 347, 6474, 347, 253, 23256, 69, 285, 39951, 11333, 50276, 783, 4679, 7194, 7106, 327, 253, 7103, 629, 285, 840, 253, 3733, 629, 310, 3240, 5075, 3738, 8607, 2868, 10046, 48960, 6667, 1421, 281, 625, 10237, 3210, 352, 310, 417, 1900, 253, 1083, 984, 48960, 6667, 476, 320, 4561, 407, 3240, 1027, 6944, 9241, 347, 5927, 5912, 4632, 5927, 5912, 8459, 352, 310, 1805, 281, 345, 2414, 600, 921, 326, 5823, 310, 2761, 347, 3809, 347, 23256, 69, 285, 2761, 347, 2266, 347, 39951, 323, 3733, 16280, 323, 7103, 436, 310, 3240, 4619, 323, 253, 8453, 273, 253, 2929, 50276, 2520, 310, 271, 4583, 6210, 1591, 886, 4525, 2929, 342, 1175, 38135, 285, 4891, 4679, 690, 2792, 943, 320, 31637, 285, 331, 21357, 7461, 264, 275, 253, 18520, 50276, 7152, 33032, 2520, 2929, 29328, 247, 5927, 15456, 5823, 2983, 281, 7472, 25774, 253, 4477, 1304, 7000, 1543, 327, 253, 2538, 273, 1027, 2957, 3470, 4679, 403, 2218, 327, 260, 338, 274, 6903, 361, 285, 18504, 13107, 1411, 253, 18539, 274, 1365, 10166, 3210, 20544, 50276, 2520, 2929, 310, 973, 15720, 3340, 342, 7000, 20121, 285, 16774, 1543, 327, 253, 2538, 273, 1027, 20362, 2957, 3470, 50276, 783, 11701, 2011, 275, 4677, 337, 3133, 12532, 342, 1534, 13868, 327, 13782, 50276, 20881, 1255, 265, 50276, 20261, 2067, 20362, 1666, 25379, 403, 2783, 597, 403, 512, 760, 6760, 1411, 23256, 8608, 10279, 610, 1162, 355, 4765, 436, 812, 2847, 247, 23539, 7103, 273, 253, 20362, 3045, 10775, 347, 247, 2442, 16502, 323, 39951, 253, 4081, 5823, 943, 320, 7561, 5762, 1411, 1027, 25774, 816, 347, 2218, 275, 253, 39951, 2929, 9187, 336, 50276, 248, 249, 9169, 436, 943, 417, 320, 43245, 1892, 7296, 253, 6733, 273, 5823, 285, 
1142, 5368, 25774, 285, 616, 2451, 10801, 2530, 275, 24088, 10237, 31591, 50276, 783, 1554, 262, 1816, 20362, 5700, 556, 2168, 644, 4081, 275, 247, 533, 516, 9861, 326, 247, 310, 417, 1014, 11106, 275, 436, 2929, 323, 479, 253, 4081, 4177, 5700, 310, 816, 247, 1755, 76, 12955, 323, 253, 3236, 1554, 262, 1816, 2983, 16280, 970, 2412, 953, 2581, 685, 2602, 4090, 18012, 310, 671, 417, 247, 747, 8900, 1580, 1113, 3642, 74, 50276, 88, 25823, 4240, 67, 3021, 253, 7681, 7680, 285, 38135, 273, 5823, 403, 3240, 3710, 50274, 1222, 641, 50276, 5371, 310, 253, 5426, 273, 298, 2188, 275, 5933, 337, 50276, 250, 3065, 247, 305, 319, 267, 1162, 355, 271, 5795, 35701, 2957, 323, 23256, 69, 3169, 48960, 5175, 6247, 3710, 7681, 7680, 3480, 273, 27163, 1411, 625, 25774, 5474, 339, 431, 248, 2929, 29328, 271, 2983, 323, 5175, 48960, 31640, 326, 310, 17324, 7938, 685, 253, 1375, 23037, 14387, 8104, 533, 1335, 11330, 9630, 1543, 253, 5750, 275, 3885, 310, 2797, 407, 970, 247, 22453, 2303, 19947, 5438, 1332, 1223, 13367, 310, 6786, 407, 970, 247, 5927, 15456, 2957, 50275, 26122, 50275, 783, 4322, 1566, 310, 417, 4767, 9825, 627, 310, 642, 5426, 273, 48960, 31640, 18848, 461, 7200, 352, 310, 417, 2590, 840, 604, 337, 253, 2983, 310, 247, 5927, 19893, 390, 4869, 39943, 2983, 374, 352, 310, 247, 10522, 390, 440, 44490, 2983, 275, 253, 1846, 3282, 908, 275, 253, 1673, 347, 10066, 281, 253, 10522, 2715, 273, 1049, 35333, 326, 310, 3185, 816, 970, 253, 8571, 323, 8493, 253, 1180, 273, 48960, 5971, 281, 1908, 275, 253, 13757, 285, 495, 604, 253, 2983, 310, 760, 2931, 275, 253, 11591, 3259, 5222, 50274, 1258, 11345, 275, 253, 5661, 7103, 50275, 783, 7103, 1057, 417, 1908, 247, 1375, 23037, 14387, 2983, 824, 347, 270, 5047, 293, 14952, 436, 310, 247, 27042, 347, 270, 5047, 293, 14952, 310, 3559, 347, 247, 3809, 285, 9630, 1332, 323, 16344, 31640, 285, 556, 2074, 6799, 5319, 347, 436, 2983, 50276, 783, 3602, 273, 253, 8104, 908, 1646, 749, 29776, 627, 310, 642, 4327, 273, 253, 4373, 22041, 285, 970, 884, 5018, 323, 23256, 69, 3133, 281, 320, 14155, 253, 13789, 273, 253, 2983, 253, 1072, 476, 320, 2032, 323, 253, 260, 88, 2983, 3340, 347, 627, 310, 642, 3748, 327, 849, 1142, 8985, 8716, 5018, 403, 1146, 908, 352, 310, 8718, 281, 1071, 253, 8104, 342, 3710, 5300, 533, 247, 625, 7000, 20185, 1783, 24088, 342, 337, 76, 5018, 651, 345, 2414, 600, 1329, 253, 3916, 273, 253, 2929, 326, 253, 2983, 4558, 10870, 281, 253, 643, 8104, 1223, 8493, 15180, 673, 50276, 783, 1408, 3181, 403, 10302, 275, 271, 30914, 10076, 253, 2264, 673, 591, 10539, 390, 1805, 591, 7267, 281, 253, 1566, 943, 320, 908, 3185, 273, 253, 2264, 18849, 673, 436, 2789, 642, 3282, 347, 271, 5795, 581, 943, 7277, 253, 13789, 273, 253, 23256, 69, 2983, 17253, 1060, 347, 3809, 533, 417, 9630, 342, 247, 4229, 15180, 673, 26332, 407, 3629, 253, 1180, 273, 5018, 2684, 407, 23256, 69, 1919, 352, 30885, 253, 1072, 2408, 273, 673, 347, 253, 5823, 2983, 50276, 783, 4477, 858, 417, 1375, 604, 597, 908, 690, 2130, 27558, 273, 253, 2983, 390, 9009, 616, 1211, 9508, 1580, 253, 15180, 673, 7024, 327, 253, 7092, 436, 1537, 320, 247, 1895, 672, 970, 253, 20243, 347, 247, 22791, 50275, 1763, 263, 6471, 7234, 285, 36542, 3916, 50276, 15834, 50275, 9088, 310, 642, 5426, 273, 253, 954, 48960, 1650, 1014, 2167, 627, 403, 2067, 10414, 273, 436, 275, 253, 2929, 436, 310, 671, 908, 275, 253, 12002, 7293, 327, 253, 8103, 247, 331, 543, 48960, 1650, 476, 320, 2326, 275, 1027, 4088, 891, 1804, 281, 5645, 436, 342, 247, 5426, 50276, 9088, 310, 642, 1941, 7738, 326, 253, 23256, 69, 2983, 310, 
2233, 2069, 17357, 685, 39951, 253, 5301, 310, 2684, 275, 30914, 10076, 835, 39951, 4648, 2233, 25142, 1223, 23256, 69, 4648, 884, 25761, 436, 310, 4767, 275, 253, 12002, 534, 2789, 253, 3908, 3477, 281, 1379, 285, 14430, 1293, 8958, 253, 3634, 436, 3908, 943, 320, 5176, 50275, 46089, 50275, 1542, 24432, 665, 878, 1524, 2606, 7103, 387, 1016, 23657, 273, 253, 3733, 1232, 273, 247, 10237, 1566, 310, 627, 1524, 2219, 326, 2430, 436, 2238, 273, 7103, 436, 310, 5816, 247, 3806, 50276, 328, 9520, 23256, 69, 10224, 281, 27340, 7472, 48960, 31640, 273, 247, 10237, 277, 9866, 436, 6197, 310, 689, 16691, 2531, 285, 417, 2032, 323, 253, 5020, 273, 253, 2219, 23256, 69, 369, 18382, 265, 2920, 908, 1411, 1142, 25774, 816, 407, 2403, 352, 17825, 281, 253, 5684, 492, 13429, 14952, 50276, 336, 2957, 534, 310, 1754, 327, 253, 5912, 273, 253, 2032, 5203, 7239, 310, 417, 271, 4569, 2557, 281, 253, 3290, 273, 48960, 6667, 627, 310, 642, 5426, 275, 253, 2929, 323, 3290, 273, 48960, 6667, 534, 2789, 436, 3908, 1077, 21643, 50276, 48521, 253, 9630, 1332, 310, 281, 15338, 1182, 90, 50276, 15701, 323, 1016, 246, 425, 82, 340, 285, 1379, 253, 954, 48960, 581, 534, 310, 247, 7561, 908, 2900, 436, 310, 417, 7561, 3197, 347, 323, 1024, 352, 3133, 760, 908, 275, 9187, 336, 14952, 50276, 81, 34466, 50275, 89, 17, 10770, 281, 253, 4983, 1127, 534, 10140, 281, 253, 3626, 1650, 390, 253, 3626, 1650, 44711, 407, 247, 1355, 305, 12064, 390, 17568, 3632, 6046, 253, 3908, 1561, 41616, 2789, 253, 5426, 273, 253, 4581, 4023, 275, 16186, 374, 2789, 253, 4023, 18932, 275, 1269, 17, 436, 1057, 417, 2723, 281, 253, 48960, 31640, 4080, 275, 253, 3236, 4076, 3410, 50276, 20415, 7424, 923, 16186, 84, 5345, 3469, 327, 269, 1269, 340, 533, 597, 2223, 13414, 3176, 3304, 253, 7424, 50276, 9328, 2692, 326, 970, 17825, 3213, 1979, 3012, 19132, 253, 48960, 31640, 943, 320, 19132, 253, 48960, 6667, 390, 19132, 253, 48960, 7103, 253, 8104, 403, 417, 11138, 31640, 50276, 6549, 1320, 50275, 2574, 84, 854, 285, 898, 897, 4903, 9765, 9840, 1620, 5611, 275, 253, 2505, 50274, 37585, 3374, 50275, 783, 5301, 342, 10522, 11830, 83, 2957, 275, 25102, 495, 943, 320, 31637, 352, 310, 1077, 2834, 281, 1239, 285, 352, 1057, 417, 1663, 9232, 253, 5750, 273, 970, 1027, 3082, 323, 46595, 272, 436, 1537, 320, 1805, 4516, 407, 690, 1941, 390, 20953, 1650, 285, 13353, 407, 6240, 690, 12288, 327, 534, 253, 9079, 310, 1754, 327, 25761, 253, 4477, 943, 840, 5513, 752, 310, 253, 3064, 432, 253, 260, 88, 2957, 347, 352, 3133, 326, 597, 403, 970, 326, 581, 50276, 40203, 285, 7180, 878, 27389, 3403, 621, 326, 19148, 752, 310, 1146, 17253, 275, 1798, 7180, 878, 11701, 275, 253, 20546, 285, 690, 27321, 273, 253, 1543, 352, 310, 671, 247, 1175, 3946, 281, 3748, 731, 275, 253, 2505, 4677, 337, 67, 4677, 374, 310, 2834, 281, 2096, 285, 4428, 247, 13691, 342, 12744, 14308, 923, 10509, 2170, 275, 2829, 374, 352, 310, 7479, 281, 2096, 752, 403, 253, 2193, 3559, 275, 253, 1341, 253, 5933, 3198, 690, 28145, 26122, 10008, 50274, 250, 3065, 50275, 1206, 13429, 14952, 492, 13429, 892, 40563, 1162, 355, 327, 17825, 8104, 281, 48960, 1650, 25774, 16424, 275, 11454, 1491, 5162, 2718, 5922, 9169, 50275, 23853, 336, 14952, 9187, 336, 1315, 1972, 1940, 285, 1111, 394, 6358, 344, 249, 9630, 7103, 273, 48960, 31640, 342, 271, 19862, 273, 11117, 4764, 4924, 8104, 5213, 8059, 327, 5145, 4715, 268, 1686, 83, 9169, 50275, 67, 5047, 293, 14952, 270, 5047, 293, 259, 1162, 355, 7899, 9630, 285, 3809, 31640, 7103, 10488, 19016, 8059, 327, 11454, 1491, 5162, 2718, 5723, 2824, 6247, 1095, 4011, 9169, 
50276, 296, 3755, 20556, 50275, 85, 2246, 281, 3157, 6733, 273, 48960, 8104, 50275, 20881, 1255, 265, 50276, 783, 2929, 310, 5816, 690, 5426, 326, 943, 417, 320, 2668, 323, 7169, 50276, 783, 7103, 310, 417, 7094, 21414, 285, 1537, 320, 16593, 50276, 16680, 943, 320, 3559, 1805, 347, 597, 403, 1077, 2834, 281, 1239, 285, 2096, 2490, 187, 4118, 18435, 27, 783, 2929, 16633, 327, 253, 2266, 48960, 2983, 26332, 271, 2983, 326, 476, 6635, 2266, 48960, 6667, 285, 3021, 476, 1805, 7472, 253, 48960, 31640, 273, 1677, 3676, 4715, 3210, 581, 2278, 3534, 247, 4868, 273, 854, 1223, 253, 643, 495, 30628, 3534, 4016, 7363, 253, 2022, 2523, 8696, 275, 253, 3710, 4679, 347, 247, 2442, 16502, 323, 39951, 253, 4081, 5823, 943, 320, 7561, 5762, 1411, 1027, 25774, 816, 347, 2218, 275, 253, 39951, 2929, 253, 4028, 273, 253, 2929, 310, 10380, 310, 417, 26565, 1690, 1142, 13583, 7234, 285, 36542, 3916, 534, 943, 320, 973, 9713, 275, 253, 18520, 3021, 352, 2550, 320, 7607, 281, 17857, 32888, 323, 697, 1655, 2715 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 50275, 783, 4081, 1332, 310, 760, 2429, 1411, 23256, 69, 285, 260, 88, 50274, 783, 4477, 452, 5393, 326, 323, 13367, 359, 7472, 253, 3290, 273, 48960, 6667, 970, 253, 8459, 875, 767, 8571, 323, 10534, 12488, 253, 954, 48960, 1650, 476, 368, 4496, 5513, 670, 954, 48960, 1650, 50272, 2461, 275, 5150, 898, 310, 417, 2931, 3738, 891, 2868, 253, 4081, 1332, 556, 253, 2442, 323, 247, 1175, 9311, 891, 513, 417, 5583, 253, 14924, 273, 253, 2929, 275, 253, 1655, 830, 5474, 339, 431, 248, 2929, 4081, 247, 2266, 48960, 2983, 26332, 271, 2983, 326, 476, 6635, 2266, 48960, 6667, 285, 3021, 476, 1805, 7472, 253, 48960, 31640, 273, 1677, 3676, 4715, 3210, 2429, 342, 253, 256, 5503, 2983, 253, 4081, 2983, 310, 1199, 7938, 285, 3021, 6927, 281, 320, 3732, 275, 3946, 253, 2934, 310, 4460, 285, 253, 1543, 403, 4891, 50276, 24330, 9021, 50276, 783, 2022, 2934, 556, 644, 12800, 275, 4677, 374, 5899, 23256, 69, 2983, 46926, 253, 5912, 273, 253, 2032, 5203, 407, 46875, 253, 2957, 285, 253, 4081, 5927, 8459, 2983, 46926, 253, 8459, 273, 253, 20552, 875, 253, 2032, 5203, 285, 253, 954, 21643, 5203, 281, 253, 1682, 273, 619, 3640, 253, 2934, 310, 4460, 275, 48960, 5145, 4715, 50276, 783, 15180, 6733, 273, 253, 5823, 2983, 310, 8644, 31735, 8607, 403, 1335, 970, 7194, 23256, 69, 323, 3733, 533, 39951, 323, 7103, 984, 39951, 310, 625, 685, 2233, 2069, 17357, 685, 23256, 69, 253, 5823, 2983, 310, 1384, 390, 1014, 1884, 2069, 7938, 685, 253, 39951, 2983, 2403, 352, 1896, 323, 3733, 342, 10046, 48960, 6667, 16280, 7938, 7103, 273, 253, 48960, 31640, 273, 1677, 3676, 4715, 3210, 275, 619, 4743, 253, 1543, 403, 1534, 50276, 585, 1209, 2224, 50276, 783, 2929, 19756, 690, 10527, 1783, 323, 1650, 849, 651, 253, 2983, 29623, 849, 281, 12215, 253, 5927, 5912, 8459, 1650, 310, 10046, 685, 253, 5927, 5912, 1650, 285, 3021, 625, 27096, 323, 1097, 16344, 285, 3733, 285, 1880, 253, 34560, 5823, 2983, 5933, 310, 347, 6474, 347, 253, 23256, 69, 285, 39951, 11333, 50276, 783, 4679, 7194, 7106, 327, 253, 7103, 629, 285, 840, 253, 3733, 629, 310, 3240, 5075, 3738, 8607, 2868, 10046, 48960, 6667, 1421, 281, 625, 10237, 3210, 352, 310, 417, 1900, 253, 1083, 984, 48960, 6667, 476, 320, 4561, 407, 3240, 1027, 6944, 9241, 347, 5927, 5912, 4632, 5927, 5912, 8459, 352, 310, 1805, 281, 345, 2414, 600, 921, 326, 5823, 310, 2761, 347, 3809, 347, 23256, 69, 285, 2761, 347, 2266, 347, 39951, 323, 3733, 16280, 323, 7103, 436, 310, 3240, 4619, 323, 253, 8453, 273, 253, 2929, 50276, 2520, 310, 271, 4583, 6210, 1591, 886, 4525, 2929, 342, 1175, 38135, 285, 4891, 4679, 690, 2792, 943, 320, 31637, 285, 331, 21357, 7461, 264, 275, 253, 18520, 50276, 7152, 33032, 2520, 2929, 29328, 247, 5927, 15456, 5823, 2983, 281, 7472, 25774, 253, 4477, 1304, 7000, 1543, 327, 253, 2538, 273, 1027, 2957, 3470, 4679, 403, 2218, 327, 260, 338, 274, 6903, 361, 285, 18504, 13107, 1411, 253, 18539, 274, 1365, 10166, 3210, 20544, 50276, 2520, 2929, 310, 973, 15720, 3340, 342, 7000, 20121, 285, 16774, 1543, 327, 253, 2538, 273, 1027, 20362, 2957, 3470, 50276, 783, 11701, 2011, 275, 4677, 337, 3133, 12532, 342, 1534, 13868, 327, 13782, 50276, 20881, 1255, 265, 50276, 20261, 2067, 20362, 1666, 25379, 403, 2783, 597, 403, 512, 760, 6760, 1411, 23256, 8608, 10279, 610, 1162, 355, 4765, 436, 812, 2847, 247, 23539, 7103, 273, 253, 20362, 3045, 10775, 347, 247, 2442, 16502, 323, 39951, 253, 4081, 5823, 943, 320, 7561, 5762, 1411, 1027, 25774, 816, 347, 2218, 275, 253, 39951, 2929, 9187, 336, 50276, 248, 249, 9169, 436, 943, 417, 320, 43245, 1892, 7296, 253, 6733, 273, 5823, 285, 
1142, 5368, 25774, 285, 616, 2451, 10801, 2530, 275, 24088, 10237, 31591, 50276, 783, 1554, 262, 1816, 20362, 5700, 556, 2168, 644, 4081, 275, 247, 533, 516, 9861, 326, 247, 310, 417, 1014, 11106, 275, 436, 2929, 323, 479, 253, 4081, 4177, 5700, 310, 816, 247, 1755, 76, 12955, 323, 253, 3236, 1554, 262, 1816, 2983, 16280, 970, 2412, 953, 2581, 685, 2602, 4090, 18012, 310, 671, 417, 247, 747, 8900, 1580, 1113, 3642, 74, 50276, 88, 25823, 4240, 67, 3021, 253, 7681, 7680, 285, 38135, 273, 5823, 403, 3240, 3710, 50274, 1222, 641, 50276, 5371, 310, 253, 5426, 273, 298, 2188, 275, 5933, 337, 50276, 250, 3065, 247, 305, 319, 267, 1162, 355, 271, 5795, 35701, 2957, 323, 23256, 69, 3169, 48960, 5175, 6247, 3710, 7681, 7680, 3480, 273, 27163, 1411, 625, 25774, 5474, 339, 431, 248, 2929, 29328, 271, 2983, 323, 5175, 48960, 31640, 326, 310, 17324, 7938, 685, 253, 1375, 23037, 14387, 8104, 533, 1335, 11330, 9630, 1543, 253, 5750, 275, 3885, 310, 2797, 407, 970, 247, 22453, 2303, 19947, 5438, 1332, 1223, 13367, 310, 6786, 407, 970, 247, 5927, 15456, 2957, 50275, 26122, 50275, 783, 4322, 1566, 310, 417, 4767, 9825, 627, 310, 642, 5426, 273, 48960, 31640, 18848, 461, 7200, 352, 310, 417, 2590, 840, 604, 337, 253, 2983, 310, 247, 5927, 19893, 390, 4869, 39943, 2983, 374, 352, 310, 247, 10522, 390, 440, 44490, 2983, 275, 253, 1846, 3282, 908, 275, 253, 1673, 347, 10066, 281, 253, 10522, 2715, 273, 1049, 35333, 326, 310, 3185, 816, 970, 253, 8571, 323, 8493, 253, 1180, 273, 48960, 5971, 281, 1908, 275, 253, 13757, 285, 495, 604, 253, 2983, 310, 760, 2931, 275, 253, 11591, 3259, 5222, 50274, 1258, 11345, 275, 253, 5661, 7103, 50275, 783, 7103, 1057, 417, 1908, 247, 1375, 23037, 14387, 2983, 824, 347, 270, 5047, 293, 14952, 436, 310, 247, 27042, 347, 270, 5047, 293, 14952, 310, 3559, 347, 247, 3809, 285, 9630, 1332, 323, 16344, 31640, 285, 556, 2074, 6799, 5319, 347, 436, 2983, 50276, 783, 3602, 273, 253, 8104, 908, 1646, 749, 29776, 627, 310, 642, 4327, 273, 253, 4373, 22041, 285, 970, 884, 5018, 323, 23256, 69, 3133, 281, 320, 14155, 253, 13789, 273, 253, 2983, 253, 1072, 476, 320, 2032, 323, 253, 260, 88, 2983, 3340, 347, 627, 310, 642, 3748, 327, 849, 1142, 8985, 8716, 5018, 403, 1146, 908, 352, 310, 8718, 281, 1071, 253, 8104, 342, 3710, 5300, 533, 247, 625, 7000, 20185, 1783, 24088, 342, 337, 76, 5018, 651, 345, 2414, 600, 1329, 253, 3916, 273, 253, 2929, 326, 253, 2983, 4558, 10870, 281, 253, 643, 8104, 1223, 8493, 15180, 673, 50276, 783, 1408, 3181, 403, 10302, 275, 271, 30914, 10076, 253, 2264, 673, 591, 10539, 390, 1805, 591, 7267, 281, 253, 1566, 943, 320, 908, 3185, 273, 253, 2264, 18849, 673, 436, 2789, 642, 3282, 347, 271, 5795, 581, 943, 7277, 253, 13789, 273, 253, 23256, 69, 2983, 17253, 1060, 347, 3809, 533, 417, 9630, 342, 247, 4229, 15180, 673, 26332, 407, 3629, 253, 1180, 273, 5018, 2684, 407, 23256, 69, 1919, 352, 30885, 253, 1072, 2408, 273, 673, 347, 253, 5823, 2983, 50276, 783, 4477, 858, 417, 1375, 604, 597, 908, 690, 2130, 27558, 273, 253, 2983, 390, 9009, 616, 1211, 9508, 1580, 253, 15180, 673, 7024, 327, 253, 7092, 436, 1537, 320, 247, 1895, 672, 970, 253, 20243, 347, 247, 22791, 50275, 1763, 263, 6471, 7234, 285, 36542, 3916, 50276, 15834, 50275, 9088, 310, 642, 5426, 273, 253, 954, 48960, 1650, 1014, 2167, 627, 403, 2067, 10414, 273, 436, 275, 253, 2929, 436, 310, 671, 908, 275, 253, 12002, 7293, 327, 253, 8103, 247, 331, 543, 48960, 1650, 476, 320, 2326, 275, 1027, 4088, 891, 1804, 281, 5645, 436, 342, 247, 5426, 50276, 9088, 310, 642, 1941, 7738, 326, 253, 23256, 69, 2983, 310, 
2233, 2069, 17357, 685, 39951, 253, 5301, 310, 2684, 275, 30914, 10076, 835, 39951, 4648, 2233, 25142, 1223, 23256, 69, 4648, 884, 25761, 436, 310, 4767, 275, 253, 12002, 534, 2789, 253, 3908, 3477, 281, 1379, 285, 14430, 1293, 8958, 253, 3634, 436, 3908, 943, 320, 5176, 50275, 46089, 50275, 1542, 24432, 665, 878, 1524, 2606, 7103, 387, 1016, 23657, 273, 253, 3733, 1232, 273, 247, 10237, 1566, 310, 627, 1524, 2219, 326, 2430, 436, 2238, 273, 7103, 436, 310, 5816, 247, 3806, 50276, 328, 9520, 23256, 69, 10224, 281, 27340, 7472, 48960, 31640, 273, 247, 10237, 277, 9866, 436, 6197, 310, 689, 16691, 2531, 285, 417, 2032, 323, 253, 5020, 273, 253, 2219, 23256, 69, 369, 18382, 265, 2920, 908, 1411, 1142, 25774, 816, 407, 2403, 352, 17825, 281, 253, 5684, 492, 13429, 14952, 50276, 336, 2957, 534, 310, 1754, 327, 253, 5912, 273, 253, 2032, 5203, 7239, 310, 417, 271, 4569, 2557, 281, 253, 3290, 273, 48960, 6667, 627, 310, 642, 5426, 275, 253, 2929, 323, 3290, 273, 48960, 6667, 534, 2789, 436, 3908, 1077, 21643, 50276, 48521, 253, 9630, 1332, 310, 281, 15338, 1182, 90, 50276, 15701, 323, 1016, 246, 425, 82, 340, 285, 1379, 253, 954, 48960, 581, 534, 310, 247, 7561, 908, 2900, 436, 310, 417, 7561, 3197, 347, 323, 1024, 352, 3133, 760, 908, 275, 9187, 336, 14952, 50276, 81, 34466, 50275, 89, 17, 10770, 281, 253, 4983, 1127, 534, 10140, 281, 253, 3626, 1650, 390, 253, 3626, 1650, 44711, 407, 247, 1355, 305, 12064, 390, 17568, 3632, 6046, 253, 3908, 1561, 41616, 2789, 253, 5426, 273, 253, 4581, 4023, 275, 16186, 374, 2789, 253, 4023, 18932, 275, 1269, 17, 436, 1057, 417, 2723, 281, 253, 48960, 31640, 4080, 275, 253, 3236, 4076, 3410, 50276, 20415, 7424, 923, 16186, 84, 5345, 3469, 327, 269, 1269, 340, 533, 597, 2223, 13414, 3176, 3304, 253, 7424, 50276, 9328, 2692, 326, 970, 17825, 3213, 1979, 3012, 19132, 253, 48960, 31640, 943, 320, 19132, 253, 48960, 6667, 390, 19132, 253, 48960, 7103, 253, 8104, 403, 417, 11138, 31640, 50276, 6549, 1320, 50275, 2574, 84, 854, 285, 898, 897, 4903, 9765, 9840, 1620, 5611, 275, 253, 2505, 50274, 37585, 3374, 50275, 783, 5301, 342, 10522, 11830, 83, 2957, 275, 25102, 495, 943, 320, 31637, 352, 310, 1077, 2834, 281, 1239, 285, 352, 1057, 417, 1663, 9232, 253, 5750, 273, 970, 1027, 3082, 323, 46595, 272, 436, 1537, 320, 1805, 4516, 407, 690, 1941, 390, 20953, 1650, 285, 13353, 407, 6240, 690, 12288, 327, 534, 253, 9079, 310, 1754, 327, 25761, 253, 4477, 943, 840, 5513, 752, 310, 253, 3064, 432, 253, 260, 88, 2957, 347, 352, 3133, 326, 597, 403, 970, 326, 581, 50276, 40203, 285, 7180, 878, 27389, 3403, 621, 326, 19148, 752, 310, 1146, 17253, 275, 1798, 7180, 878, 11701, 275, 253, 20546, 285, 690, 27321, 273, 253, 1543, 352, 310, 671, 247, 1175, 3946, 281, 3748, 731, 275, 253, 2505, 4677, 337, 67, 4677, 374, 310, 2834, 281, 2096, 285, 4428, 247, 13691, 342, 12744, 14308, 923, 10509, 2170, 275, 2829, 374, 352, 310, 7479, 281, 2096, 752, 403, 253, 2193, 3559, 275, 253, 1341, 253, 5933, 3198, 690, 28145, 26122, 10008, 50274, 250, 3065, 50275, 1206, 13429, 14952, 492, 13429, 892, 40563, 1162, 355, 327, 17825, 8104, 281, 48960, 1650, 25774, 16424, 275, 11454, 1491, 5162, 2718, 5922, 9169, 50275, 23853, 336, 14952, 9187, 336, 1315, 1972, 1940, 285, 1111, 394, 6358, 344, 249, 9630, 7103, 273, 48960, 31640, 342, 271, 19862, 273, 11117, 4764, 4924, 8104, 5213, 8059, 327, 5145, 4715, 268, 1686, 83, 9169, 50275, 67, 5047, 293, 14952, 270, 5047, 293, 259, 1162, 355, 7899, 9630, 285, 3809, 31640, 7103, 10488, 19016, 8059, 327, 11454, 1491, 5162, 2718, 5723, 2824, 6247, 1095, 4011, 9169, 
50276, 296, 3755, 20556, 50275, 85, 2246, 281, 3157, 6733, 273, 48960, 8104, 50275, 20881, 1255, 265, 50276, 783, 2929, 310, 5816, 690, 5426, 326, 943, 417, 320, 2668, 323, 7169, 50276, 783, 7103, 310, 417, 7094, 21414, 285, 1537, 320, 16593, 50276, 16680, 943, 320, 3559, 1805, 347, 597, 403, 1077, 2834, 281, 1239, 285, 2096, 2490, 187, 4118, 18435, 27, 783, 2929, 16633, 327, 253, 2266, 48960, 2983, 26332, 271, 2983, 326, 476, 6635, 2266, 48960, 6667, 285, 3021, 476, 1805, 7472, 253, 48960, 31640, 273, 1677, 3676, 4715, 3210, 581, 2278, 3534, 247, 4868, 273, 854, 1223, 253, 643, 495, 30628, 3534, 4016, 7363, 253, 2022, 2523, 8696, 275, 253, 3710, 4679, 347, 247, 2442, 16502, 323, 39951, 253, 4081, 5823, 943, 320, 7561, 5762, 1411, 1027, 25774, 816, 347, 2218, 275, 253, 39951, 2929, 253, 4028, 273, 253, 2929, 310, 10380, 310, 417, 26565, 1690, 1142, 13583, 7234, 285, 36542, 3916, 534, 943, 320, 973, 9713, 275, 253, 18520, 3021, 352, 2550, 320, 7607, 281, 17857, 32888, 323, 697, 1655, 2715 ]
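Note on the numeric columns: in the rows shown here the labels sequence is identical to input_ids and the attention_mask is all ones, i.e. the columns are a plain causal-LM encoding of the concatenated instruction, review, and summary text with no padding. The sketch below shows how such columns could be reproduced; the tokenizer checkpoint, the exact prompt layout, and the maximum length are assumptions, since the dump does not state how the encoding was produced.

```python
# Hypothetical sketch of rebuilding the input_ids / attention_mask / labels
# columns from a text row. The actual tokenizer and prompt template used for
# this dataset are not stated in the dump, so the checkpoint name below is
# only a placeholder assumption.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed, not confirmed

def build_row(review_text: str, summary_text: str, max_len: int = 2048) -> dict:
    # Mirror the layout of the text rows: instruction, "### Review:", the review,
    # then "### Summary:" followed by the target summary.
    text = (
        "Below is given review of a research paper from cnoference journal. "
        "Please write a summary the review.\n### Review:\n"
        + review_text
        + "\n### Summary:\n"
        + summary_text
    )
    enc = tokenizer(text, truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all 1s when no padding is applied
        "labels": list(enc["input_ids"]),         # labels mirror input_ids, as in the rows above
    }
```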
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: while the title of the paper suggests that it leverages techniques from the vast literature on numerical continuation the proposed approach is much more specific the main idea consists in annealing the temperature parameter of the soft bellman operator and to warm start the the corresponding series of problems across time in the language of numerical continuation this approach fits under the umbrella of natural parameter continuation which i would describe succintly as warm starting from the perspective of addressing the challenging optimization landscape presupposed in many problems the combination of continuation smooth approximation to the optimal equations makes sense however for me the narrative doesnt hold when the authors motivate their method in the context of offline batch policy gradient methods as the authors point out the main challenge associated with offline data is the inability to sample new data this is problematic in the policy gradient setting because our derivative estimator only holds under the distribution under which the samples have been collected as soon as the policy parameters are updated the distributional shift should be addressed via an appropriate change of measure or via a model to me this is the main challenge in the offpolicy setting and the proposed continuationbased solution does not address this issue i view this as a monte carlo estimation problem first not one pertaining to the optimization landscape perhaps the paper should have been named differently because the remaining theoretical contributions in the paper do not pertain to continuation per se theorem 1 provides a bound on the policy gradient methods with a softmax policy this is different from the soft optimality equations theorem 2 and 3 follow from the results in rust 1994 1996 and in econometrics where the smooth soft bellman operator has been widely used theorem 2 follows from the perspective of policy iteration as an application newtonkantorovich to the smooth bellman optimality equations theorem 3 follows from dinis theorem where limtau to 0 tstartau v tstar v where ttau would be the smooth soft bellman operator and tstar the usual bellman operator see rust 1996 numerical dynamic programming in economics section 4 more specifically equations 42 and 44 docsepsummary of the paper this paper proposed a new batch rl algorithm based on the continuation method in numerical optimization the proposed method makes use of kl regularization in the offline policy optimization process with a decreasing temperature parameter which comes from the continuation method of optimization the paper shows that the policy learning with the kl regularized value yields a faster convergence rate as the motivation of using kl regularization and a decreasing temperature will ensure the final convergence to the optimal policy experiments on mojoco atari and a recommender dataset shows that the proposed algorithm is effective justification for the score the proposed algorithm is a natural improvement over the current behavior regularized policy optimization methods the intuition from the continuation method provides a justification for using a decreasing temperature i think the main merit of this paper is the solid experiment in simulation tasks with discrete actions and continuous actions and in a real dataset this contribution seems solid but i have some concerns about theorem 1 and important related work 
that is missed detailed comments pro 1 the intuition behind the algorithmic change is clear enough section 31 gives the motivation of using a kl regularization and section 33 gives an illustrative example of why the continuation method can be better than a constant threshold 2 the experiment section covers the standard rl benchmark in both continuous and discrete action settings it additionally studied the performance of the proposed algorithm in a real dataset since while most batch rl work only run experiments in simulation tasks this is a good step to bring the algorithm close to the motivation of doing batch rl cons 1 theorem 1 seems to be a bit disconnected from the main contribution of the paper and its hard to understand its role for me later theoretical analysis and practical approximation are all based on the policy iteration algorithm but theorem 1 seems to be based on policy gradient could the convergence rate improvement be shown in the policy iteration case additionally theorem 1 says maximizing widetildevpitau does that mean theorem 1 consider the onpolicy setting in such a case what does beta mean 2 the proposed algorithm seems to be very related to the brac framework and the algorithm brac with the kl value penalty why is it not mentioned at all the brac paper may not directly use such a decreasing temperature or link it to the continuation method but as prior work in batch rl it also considered kl regularization and even studied using an adaptive regularization coefficient 3 more recent batch rl algorithms like brac cql etc also reported their result in mujoco and atari cql did domains and seems to be better than bcq bear and rem an open dataset d4rl has the reported performance of those more recent baselines i think in general it will be better to compare with these more recent baselines especially brac since its very related docsepsummary the paper extends soft actorcritic sac to the batch rl setting replacing the policy entropy in the objective function with the kl divergence from the behavioral policy the temperature parameter tau weighting the reward agains the kl term is annealed towards zero during the optimization process which corresponds to starting with behavioral cloning for high values of tau and ending up with the standard reward maximization rl objective for tau0 theoretical analysis and experiments confirm the advantages of the proposed method decision i vote for accepting the paper the idea of annealing the kl constraint is simple and elegant although it is very similar to other constrained policy update methods discussed in the related work section the evaluation in the batch rl setting and demonstration of the improved convergence properties is novel the execution is of high quality with evaluations on tabular problems mujoco atari and a contextual bandit problem for movie recommendation questions 1 as pointed out in sec 33 when the policy deviates too much from the behavioral policy the value estimate becomes erroneous therefore a criterion based on the ensemble variance of the qfunction estimates is proposed is there a way to derive such a criterion from first principles 2 can you relate your work to 1 3 does your method work when the behavior policy is not known but only the dataset is available 4 can you quantify how far the optimal policy is allowed to be from the behavioral policy for example on a pendulum swingup task if the behavioral policy is taking random actions and the pendulum always jitters at the bottom inferring the optimal policy from 
this data appears quite challenging can one give some criteria when the method is expected to work well references 1 nachum o dai b 2020 reinforcement learning via fenchelrockafellar duality arxiv preprint arxiv200101866docsepthe authors propose a kl regularized approach for batch rl where the importance of the kl term is reduced during learning theoretical guarantees are provided in the tabular domain the algorithm is tested in several domains mujoco atari and a recommender task strengths although the theoretical results are mostly straightforward extensions of existing results they provide a solid backing to the method in particular theorem 1 is an interesting result which motivates kl regularization in this setting the experimental section is very thorough i would prefer more seeds but the coverage over many domains and behavior policies is convincing evidence for the empirical success of the method the appendix is very comprehensive although code is missing the method and experiments are reproducible with the provided descriptions the writing was clear and the method was wellmotivated overall the paper feels fairly complete and well polished weaknesses there is one very glaring weakness to this paper the proposed kl regularized approach for offline rl already exists wu et al 2019 related jacques et al 2019 this is additionally problematic as both methods are not cited or discussed in the paper to the best of my knowledge the continuation aspect is still novel as well as the theoretical contributions however a discussion on wu et al 2019 is necessary my first impression was that the variance solution to checkpointing tau in section 33 was somewhat hacky on second thought however as suggested by the authors in the introduction gradually reducing tau provides a mechanism for searching for the optimal value which trades between the constraint and learning on top of the proposed benefits of continuation for optimization i think this is an interesting component of the method on the plus side i think the checkpointing solves an important problem for batch rl but on the downside i think measuring the variance of the ensemble is not as well motivated as the rest of the method i think the paper could benefit from additional discussion or experiments which examine this aspect further while the method is the best over a wide range of domains the performance benefit seems fairly incremental consequently for most users its unclear if the benefits are sufficiently significant to warrant the additional complexity recommendation so firstly the novelty concerns absolutely need to be addressed and the mentioned papers citeddiscussed in the paper regardless i do feel like there is a meaningful contribution that builds on this prior work at both a theoretical and empirical level as a result im leaning on the side of accept references wu yifan et al behavior regularized offline reinforcement learning 2019 jaques natasha et al way offpolicy batch deep reinforcement learning of implicit human preferences in dialog 2019 postrebuttal the authors have addressed most of my concerns i have increased my score although the additional experiments on the variancecheckpointing are helpful i would still like to see more discussion in the paper itself ### Summary:
the paper got a quite high disagreement in the scores from the reviewers r2 voted for rejecting the paper as he did not see the connection of the algorithm to the continuation method and also that the continuation method does not address the distributional shift which is one of the main problems for offline rl yet these concerns have been properly answered in the rebuttal of the authors and the distributional shift is also addressed by the continuation method by reducing the error in policy evaluation further concerns from the reviewers were raised in terms of related work to a similar algorithm brac which is also addressed in the revision of the paper the reviewers also identified the following strong points of the paper the algorithm is a simple and very effective adaptation to sac the presented results are exhaustive and convincing the paper provides strong theoretical results for the presented algorithm the authors did a very good job with their revision adding more comparisons and ablation studies i agree that this paper is very interesting and recommend acceptance
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes to use image representations from trained selfsupervised models to evaluate gans more accurately compared to the currently used representations from supervisedpretrained models eg inceptionv3 the authors claim that such embeddings suppress information not critical for the classification process which however is assumed to be crucial for assessing the full distributions of real and generated images the authors use 5 datasets and their respective representations from 5 models 3 supervised and 2 selfsupervised to show that representations from selfsupervised models lead to better gan evaluations the representations were used to evaluate 6 gan models with 3 metrics namely fid precision and recall a ranking of the gan models shows inconsistencies between supervised and selfsupervised based representations by visual inspection prediction accuracy tests and a comparison of representation invariances the authors show that rankings via selfsupervised embeddings are more plausible pros interesting proposal to better evaluate gans and generative models for image data the paper is well written and easy to understand the experiments are extensive and support the claim of the authors testing for invariances of representations is an interesting idea and the results support the use of embeddings from selfsupervised models cons the authors argue that latent representations from an autoencoder capture all the information from images it would be interesting to see how such representations eg from the autoencoder used to show the invariances described in section a1 behave compared to the proposed selfsupervised representations i would like to see them included in the experiments minor comment typo in a1 corrseponding edit the authors have not responded to any of the reviews i lower my rating to 4 edit2 oh there was a misunderstanding i probably was not logged in and didnt see any comments and reviews i raised the rating and will read the answers and will rate again edit3 after reading the rebuttals i raise my rating to 7 docsep summary the paper looks at the problem of evaluating gan samples current methods such as fidpr with inception v3 are problematic because they generally depend on using the features of a model discriminatively trained on a superset of imagenet the authors show that these types of models ignore details that are meaningful when for example comparing results on celebahq instead the authors propose to use a recent selfsupervised trained model which has been shown to provide more general representations they take a selection of recent powerful gans and compare the ranking of their results based on fidpr with discriminative imagenet features vs selfsupervised features and show that there are indeed differences to evaluate the ground truth the authors devise a number of small experiments that attempt to establish these facts retrieving celeba labels from features of each model and an additional classifier trained on one gan output evaluated on another in all experiments the authors show that the selfsupervised trained model produces a gan ranking that is closer to the truth review the paper is well written and provides a very nice overview of recent advances in gans and description of their evaluation methods while the proposed method is a simple improvement over previous work replace the feature extractor keeping most else constant the empirical
evaluation is very thorough and well done in particular i found the additional experiments based on the results in table 1 and 3 very informative and welcome results in table 2 and 4 give a very interesting confirmation that selfsupervised embeddings are indeed more informative the visualisations are well done and relevant and the markings in the table make finding comparisons straightforward discussion while the ordering between swav and deepclusterv2 is the same the actual numbers are significantly different do the authors know why this might happen the authors compare to two specific selfsupervised algorithms which are additionally trained using clustering is there a particular reason those models were chosen does the type of contrastive learning impact the results or are generally better representations as measured by finetuning for imagenet better for gan evaluation too would it be worthwhile to attempt to add the types of artifacts in gans to the set of augmentations done for the contrastive learning in general i think the paper can benefit from more analysis around the choice for the right selfsupervised network and tradeoffs post answer my questions are appropriately answered and i appreciate the addition of section 34 i think my current score accurately reflects my evaluation of the paper with the remaining concern being the magnitude of the contribution docsep this paper provides an interesting empirical study of selfsupervised image embeddings as an alternative to pretrained imagenet classification models for the purpose of evaluating the quality and diversity of gan generative image models i always found it a little odd that a model trained on imagenet with crossentropy loss should somehow be a magical universal quality metric and i am happy to see that this paper provides good evidence that this is not the case the authors select 2 selfsupervised models and compare them against a number of supervised models the metrics used are fid and precisionrecall i am curious why inception score was not also compared the paper does quite a thorough job of selecting and comparing models by normalizing for architecture and changing dataset or loss function it shows clearly that selfsupervised methods outperform the supervised methods for ranking various gan models it would have been interesting to train the selfsupervised model on the dataset itself eg lsun or celeba to see whether that provides an even more useful signal given that deep networks find it hard to generalize across datasets i would expect that directly training an embedding on the target dataset would do better did the authors try something along these lines a minor comment is that the layout of the results and comments is a bit confusing due to the large number of points that refer to a particular figure and needing lots of scrolling back and forth some better way to organize the information and comments would be appreciated i would also find it insightful to better understand why selfsupervision works better for evaluating representations any comments in this regard would be interesting lastly i am curious why the authors did not consider selfsupervised methods such as simclr i have read the rebuttals and other comments and maintain my rating of the paper docsep overview of paper this work compares supervised feature extractors vs two types of selfsupervised feature extractors for the task of gan model evaluation it shows that the ranking provided by selfsupervised features is different from that of supervised features and
claims it corresponds better with human judgement experiments are conducted on multiple large gans and datasets novelty i am not aware of previous works investigating selfsupervised features for gan evaluation but note that 1 evaluated selfsupervised features as a perceptual loss which is highly related significance the current metrics used to evaluate gans are wellknown to be problematic and the search for better measures is important on the other hand i am not certain that this paper is conclusive enough to be able to shift the community towards different metrics this is a hard thing to do as even changing the evaluation from using inceptionv3 features which are fairly outdated to more modern resnets has not happened yet methodology i have a few issues with the method although the authors claim to have better agreement with human and objective judgements this is not extremely well justified eg showing that swav can classify facial attributes better than inceptionv3 is not by itself very indicative it uses a better architecture and more generally transfer learning of selfsupervised vs supervised methods was extensively validated in the original papers and is about equal also supervised imagenet features are particularly poor for faces i guess that for other object types results might be different the evidence for the groundtruth ranking for precision and recall is not particularly strong evaluation many gans and datasets evaluated why only swav and deepcluster why not use the other popular contrastive learning methods eg moco simclr byol overall the investigation of better ways of evaluating gans is important the main criticism is that in my opinion not enough effort was taken to establish the groundtruth ranking between models making the results of this investigation less significant the rebuttal addressed my concerns i increased the score 1 zhang et al the unreasonable effectiveness of deep features as a perceptual metric cvpr18 ### Summary:
all four reviewers unanimously recommended acceptance four 7s they generally appreciated that the proposed idea is novel and the experiments are convincing i think the paper tackles an important problem of evaluating gans and the idea of using selfsupervised representations as opposed to the conventional imagenetbased representations would lead to interesting discussions and followups
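The reviews and summary above turn on which embedding space the FID and precision/recall statistics are computed in, not on the distance itself. As a point of reference, here is a minimal sketch of the Frechet distance over precomputed embeddings; the encoders under discussion (Inception-V3, SwAV, DeepCluster-v2) are assumed to be applied elsewhere to produce the two arrays, the function name `frechet_distance` is illustrative, and nothing below is taken from the paper's code.

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(real_feats, gen_feats):
    """FID-style distance between Gaussians fitted to two (N, D) arrays of
    image embeddings; swapping the encoder that produced the embeddings is
    what changes the ranking discussed in the reviews, not this formula."""
    mu_r, mu_g = real_feats.mean(axis=0), gen_feats.mean(axis=0)
    cov_r = np.cov(real_feats, rowvar=False)
    cov_g = np.cov(gen_feats, rowvar=False)
    covmean = sqrtm(cov_r @ cov_g)
    if np.iscomplexobj(covmean):  # sqrtm can return tiny spurious imaginary parts
        covmean = covmean.real
    diff = mu_r - mu_g
    return float(diff @ diff + np.trace(cov_r + cov_g - 2.0 * covmean))
```

Precision/recall estimators would consume the same two embedding arrays, so the encoder swap the paper studies applies to them in exactly the same way.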
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

this paper studies explainable nlp and surveys available datasets and taxonomizes existing data collection methodologies focusing on a categorization of highlights freeform and structured explanations the paper outlines how these categories appear in the datasets and certain weaknesses eg artifacts introduced by annotators or strengths that appear in certain datasets there is also some highlevel prescriptive guidance on best practices to carry forward

post author response: thanks to the authors for making the changes i think they help on clarity i would keep the rating the same because as the authors mention some of the weaknesses are fairly inherent to the abridged survey form i think the paper could be accepted

the paper is comprehensive and ambitious in scope i found the paper useful as an overview of datasets with explanations in nlp though admittedly as an outsider to the field so i cannot compare this metaanalysis with other metaanalyses of the same topic i think this paper is a good resource for navigating the landscape of explainable nlp for outsiders i found it especially helpful when the paper synthesizes or draws direct comparisons across datasets

overall the paper is a bit of a highlevel metaanalysis summary summarizing some metadata of the datasets and pulling in / drawing out some specific attributes of specific datasets i find the latter synthesizing discussions across datasets comparing structural qualities across datasets much more informative than the table by itself the prescriptive guidelines and reflections in this paper are a bit highlevel and descriptive and could be sharpened further although i could see this being a good resource for newcomers eg like a wellwritten blog post for visibility the sharpness of the analysis seems like there is room for improvement although the taxonomy is based on the formal structure of the explanation eg splitting into highlight freeform structured the paper remains fairly descriptive in summarizing or recounting the metadata of the datasets without finergrained analysis at times the editorial analysis eg section 5 felt a bit informal or underargued while it was helpful to include specific examples a more structured comparison of datasets along the evaluative criteria eg sufficiency and so on for highlights would have been helpful i also found it helpful when the paper recounted critiques of datasets as this may be more evident to insiders to the literature than outsiders making this more structured even including a column for possible concerns in the table though this could be understandably a burden for the authors to maintain could sharpen the utility of the summarizing tables

this survey paper provides a concise overview of 61 datasets for explainable natural language processing highlighting their advantages and disadvantages and making suggestions for better data collection strategies it provides a good starting point for exnlp the authors start with explaining why textual explanation is important for ml models and then introduce three types of explanations and existing datasets in particular they discuss how the dataset collection methods can affect modelling focusing on the question of whether human explanation is sufficient or comprehensive for predictions unsurprisingly it is not the paper further discusses means for improving the data quality and diversity for instance by using templatebased explanations

post author response: thanks for the responses clarifying the structure of the draft and incorporating the feedback in the final version the final score remains unchanged as i already thought it can be published

a quite comprehensive and wellorganised introduction and summary on large amounts of datasets for explainable nlp which will be handy for both experts and beginners the suggestions for data collection provided can guide researchers for creating high quality data it is not very clear how to assess explanations in a dataset although the authors dedicate some sections to the related discussion the discussion covers various domains and dimensions making it less focused a short description in section 4 or 6 summing up key takeaways would be great for instance what qualifies a good or acceptable explanation is there a quantifiable measure

this takes a muchneeded lens to the details of how exnlp datasets are actually collected providing a corrective to the modelcentric approach of much of machine learning it points out some weaknesses in how we collect exnlp data and how we document the collection process the suggestions it makes for improvement are reasonable

post author response: thanks for making the improvements my rating is unchanged

makes novel points that are key to doing exnlp properly provides constructive suggestions for improving current practices provides many starting points for future work on exnlp the paper is a little unfocused and comes across at some points as a collection of different points related to the same subject datasets it feels like some sections could be expanded into standalone papers in themselves however perhaps this is understandable in a paper that is tackling this subject dataset quality in exnlp for the first time i think more narrative connecting the different points would help the reader understand any ways in which the different points made contribute to a unified message

### Summary:

the paper surveys existing datasets and data collection methodologies for explainable nlp pointing out strengths and weaknesses of existing datasets and providing suggestions for future data collection in this area overall the reviewers found the paper interesting and useful to the community but had some issues with the clarity and presentation the authors were largely able to address these concerns with their responses and revised draft and in the end all reviewers thought that the paper should be accepted congratulations on having your paper accepted to the neurips 2021 track on datasets and benchmarks when preparing the cameraready paper the authors are encouraged to take the reviewers feedback into account specifically with improving the clarity and readability of the paper
[ token-ID sequence omitted — tokenized form of the prompt and review above (input_ids / labels column) ]
[ attention_mask omitted — a sequence of all ones ]
[ token-ID sequence omitted — a second tokenized copy of the same prompt and review (input_ids / labels column) ]
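The tokenized columns interleaved with the text above duplicate the plain-text Input/Output fields in ID form. As a minimal sketch only — the dataset path and tokenizer name below are assumptions, since neither is named in this dump — a row could be decoded back to text roughly as follows:

```python
# Minimal sketch for inspecting one row of a dataset shaped like this dump.
# Assumptions (not stated in the document): the data loads with the Hugging Face
# `datasets` library, and the IDs were produced by a GPT-NeoX-style tokenizer;
# the dataset path and tokenizer name below are placeholders.
from datasets import load_dataset
from transformers import AutoTokenizer

ds = load_dataset("your-org/review-summarization", split="train")  # hypothetical path
tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")      # assumed tokenizer

row = ds[0]
text = tok.decode(row["input_ids"], skip_special_tokens=True)       # prompt + review text
print(text[:300])
print(len(row["input_ids"]), sum(row["attention_mask"]))            # mask is all ones when unpadded
```

If the labels column simply mirrors input_ids, as the repeated sequences in this dump suggest, decoding either column should yield the same prompt and review text.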
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

this work proposed a novel setting for ssl where unlabeled data contains novel categories and it aims to simultaneously train classifiers for seen classes and discover novel classes this is achieved by introducing a pairwise loss and regularizing with a prior class distribution experiments are carried out on synthesized data splits and demonstrated good results

strength: this paper proposed a new setting for ssl simultaneously training a classifier for seen classes and discovering novel classes from unlabeled data is novel comparison with existing methods adapted to the new setting is competitive

weakness: the expected number of novel classes is a strong assumption for openworld ssl inferring the number of classes is only briefly mentioned in the paper i would like to see more details the uncertainty in eq4 is not explained well i am wondering why other choices eg entropy are not selected moreover is the maxk among the known classes or over all classes if it is over all classes i do not see how the classifier wk k in cu is trained i do not see any term that can avoid assigning unlabeled data to seen classes for example eq5 can still be minimized if all unlabeled pairs are assigned to the same seen classes regularizing the average posterior wrt a prior distribution is somehow a strong assumption in that the distribution for unseen classes is known in advance pretraining could substantially improve the quality of feature representation as a result the pairwise objective might heavily rely on the pretraining i think it is worth discussing or providing experiments to validate this point this paper provides a new perspective into ssl but the designs are not fully explained and some remain unjustified eg the prior distribution for all classes and the number of novel classes

this paper considers the problem of openworld learning where the labeled data are from seen classes and unlabeled data are from both seen and novel classes to address this problem this paper presents an uncertainty based adaptive margin method for learning the joint classifier of seen and unseen classes the proposed method can avoid the model assigning samples of novel classes to the seen classes experiments on several datasets show the effectiveness of the proposed method

pros: this paper is wellwritten and easy to follow the motivation is clear and the overall organization is good a new setting is proposed in the community of semisupervised learning which jointly considers semisupervised learning openset learning and novel class discovery this is a difficult but very practical problem in the real world a simple but effective approach uncertainty based adaptive margin is proposed to address the introduced setting it can avoid the model quickly converging to the seen classes so that the model can well recognize both seen and unseen classes extensive experiments are provided to verify the effectiveness of the proposed method in addition this paper makes great contributions to implementing existing ssl and novel class discovery methods in openworld learning

cons: this paper states that the proposed method can estimate the number of unseen classes however it seems that it should first estimate a rough number with dtc how about the results of setting a larger number of unseen classes eg 200 for cifar100 learning the proposed model and then estimating the number of unseen classes i appreciate that the authors implemented existing ssl and novel class discovery methods on the proposed setting however their implementation details are not very clear to me please introduce how to reproduce them in the supplementary for the estimation of uncertainty why not use other methods such as the entropy and mcdrop a in addition for the fixed negative margin does 0.5 produce the best results if not the authors should compare the proposed method with different values of the negative margin the proposed pairwise loss and regularization loss are not novel we can find many semisupervised learning and clustering methods using these two losses in addition for the pairwise loss this paper selects the nearest neighbor as the positive candidate however in practice this could introduce many falsenegative pairs when the number of classes is large and the training batch size is small i think this will be an issue for the proposed method especially when there are many classes for example the proposed method may meet problems when learning on the large imagenet which has 1000 classes indeed i think this is a longstanding problem that our community should consider and this may not be addressed in this paper for selfsupervised learning i would like to see the results of using different selfsupervised learning methods such as simclr mocov2 and rotationnet it is very interesting that b found that rotationnet achieves better results than simclr and mocov2 for novel class discovery therefore i am curious whether this phenomenon happens in the proposed setting i think the main contribution is the proposed adaptive margin method therefore this approach can be applied to other novel class discovery methods how about applying the proposed margin method to rankingstatic i think one important result is missed in the paper we assume that the samples of novel classes and samples of seen classes are separated in the unlabeled data that is for the unlabeled data we know which samples are from novel classes or seen classes but we do not know the class labels i think this could be one upper bound for the proposed setting in addition under such an assumption can the proposed method improve the results or do we only need the simple pairwise loss and regularization loss also rankingstatics another two important experiments are missing 1 applying existing openset methods for the introduced setting 2 applying the proposed method on the openset recognition setting at last i strongly recommend the authors release the source code during submission 1 this is the first work that studies the openworld setting 2 i know that this paper was submitted to several venues and many researchers know this paper/setting i also found several works that follow this setting and compare with this work thus providing the source code could be a good start for this new task a y gal and z ghahramani dropout as a bayesian approximation representing model uncertainty in deep learning in icml 2016 overall i like this work especially the proposed setting although the novelty is not very strong this is the first step for openworld learning also there are some concerns and missing experiments that should be addressed during the rebuttal i think this is a promising and interesting task so that the first work should design a wellconsidered setting and provide extensive comparisons for the following works

this paper studies a new setting for openworld semisupervised learning this setting extends the typical semisupervised learning by considering unseen classes in the test set this paper introduces an uncertainty adaptive margin mechanism for this new problem the results are evaluated on multiple datasets the results are promising overall this paper is well written the paper is well motivated the idea of adaptive margin and estimating intraclass variance using uncertainty is novel to me i have a few questions regarding the experiments in figure 3 right the accuracy of orcazm drops significantly at epoch 140 the authors explained that this shows orcazm is not able to reduce intraclass variance however its not clear why the decay of learning rate will trigger the performance drop i suggest replacing improvement in table 2 with relative improvement for the readers convenience the proposed margin term is neat as described in eq 4 it would be good if the authors could provide some analysis regarding ls when the margin is eq 4 for example how will the margin mitigate the bias this paper is very well written and well motivated i recommend accepting this paper for publication if the authors could address a few issues described above

in this paper the authors define a problem with a more realistic perspective in semisupervised learning openworld semisupervised learning which considers a situation where unseen classes are included in unlabeled and test datasets the authors propose a method called orca which tries to solve the problem caused by openworld semisupervised learning the main function of the orca is balancing intraclass variation between seen and novel classes with 3 kinds of loss terms

strong points: by defining a new realistic problem the authors provide a direction for researchers to move forward with a persuasive and structured explanation this paper contains a wide range of experiments to support the authors claims although there is no clear competitor in the defined problem authors adopted various methods from relevant fields with slight appropriate changes also they provide ablation study results to validate the proposed method

weak points: compared to the detailed technical explanation of the proposed method the solid explanation of the statements and intuition is insufficient as the authors define a new problem openworld ssl the experimental setting is somewhat arbitrary

questions: while uncertainty adaptive margin takes a key role in orca a detailed explanation of how it works doesnt seem to suffice readers would not be able to understand with only intuitive explanations and references so a more detailed explanation is necessary for example why eq3 means the large margin of seen classes in early training epochs in the experiment settings authors set the ratio of seen/novel classes to 50 can it be said that this setting represents a realworld problem though there are no competitors to openworld ssl baseline algorithms like fixmatch and ds3l show somewhat lower performances than in their original papers with conventional ssl settings for the authors method to be more convincing it seems necessary to show that the orca does not fall behind the baseline in the existing ssl problem without any novel class although the paper has some weaknesses such as comparison in the conventional ssl setting the paper proposes a new problem that considers a more realistic situation and the proposed method seems to work properly for the designed setting

the paper introduces a new semisupervised learning problem openworld semisupervised learning where the objective is to recognize samples from seen classes as well as cluster samples from novel classes the proposed method consists of three components to address the different aspects of this problem a supervised ce loss with uncertainty adaptive margin to recognize samples from seen classes a pairwise objective to cluster samples from novel classes and a regularization term to avoid degenerate solutions the effectiveness of the proposed method has been validated on multiple benchmark datasets

strengths: this work introduces a realistic semisupervised problem that accounts for the presence of samples from novel classes in the unlabeled set this problem setup is interesting and would make the developed ssl solutions applicable to more practical scenarios the proposed solution achieves promising results even though the technical novelty of different components of the proposed solution is not high incorporating these multiple components to solve different aspects of the challenging openworld semisupervised learning problem is novel enough the paper is well written and easy to follow the ablation is extensive besides the work includes analysis under various challenging conditions like class imbalance varying percentage of seen and novel classes etc

weakness and concerns: the effectiveness of the pairwise objective should greatly depend on the quality of selfsupervised features the proposed method and most of the baselines have utilized simclr pretrained weights an analysis would have been nice to see how the performance of the proposed method changes with different selfsupervised pretraining schemes especially the ones with pretext tasks like rotnet jigsaw etc besides for a lot of datasets the selfsupervised pretrained weights might not be available for instance the singlecell dataset therefore it is crucial to know how the method performs without incorporating the selfsupervised pretrained weights on the benchmark datasets cifar10 cifar100 imagenet100 the experiments in the main text used a large portion of labeled data 50 which is not a very practical setup for semisupervised learning however the appendix includes results with 10 labeled data since the sota closedworld ssl methods are very label efficient similar experiments with 1 5 etc labeled data could have been conducted pseudolabels for the pairwise objective have been generated by finding the most confident pairs however for datasets with a large number of classes more than one sample might not be present in a minibatch if the batchsize is not sufficiently large therefore it would be interesting to know the sensitivity of the proposed method to the batchsize parameter how were the hyperparameters for the proposed method tuned did the authors use a validation set for this purpose if that is the case then did that validation set contain labeled samples from novel classes because using labeled samples from novel classes to tune hyperparameters would make the proposed method less practical comparison with an unsupervised clustering method like scan 1 would have been interesting

questions and comments: the results reported in tables 2 and 5 demonstrate that the accuracy on novel classes is higher than that of seen classes for the cifar10 dataset since for all the other methods the seen class performance is higher it would be interesting to know any insight behind this phenomenon in parallel to the zero margin and fixed margin experiments have the authors conducted any experiments by decaying the margin for the ce loss with a predefined schedule linear sigmoid cosine etc even though this approach involves some hyperparameter tuning it would reduce the computation needed to compute the uncertainty adaptive margin

1 gansbeke et al scan learning to classify images without labels eccv 2020

this work introduces a new semisupervised learning problem that has more realworld applications even though the technical novelty of different components of the proposed method is not high the solution is well motivated and achieves promising performance on multiple datasets however some of the points are not convincing and require further clarification to assess the significance of the work i would appreciate it if the authors address these concerns in their response

### Summary:

this paper is proposed to address a novel but practical setting where the test set consists of both seen and unseen classes of the training set to tackle the crucial challenge of distribution mismatch between the inlier and outlier features the authors proposed a new method named orca by grouping similar instances to enlarge the classwise margin for debiasing the experimental results on imagenet have shown the proposed orca has significantly outperformed baselines in both inlier classification and outlier detection the whole paper is written with clear logic and is easy to follow moreover such a new setting may bring more inspiration to the community
[ token-ID sequence omitted — tokenized form of the review above, starting mid-text (input_ids / labels column) ]
[ attention_mask omitted — a sequence of all ones ]
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: summary this paper aims to improve adversarial robustness of the classifiers from a different perspective than the existing works usually the networks are trained using adversarial examples to improve robustness adversarial training this work extends this line of thought and makes an input robust to adversarial attacks instead of updating the network they make updates to the input to gain robustness in other words this work explores the existence of safe spots near the input samples that are robust against adversarial attacks results on cifar10 and imagenet reveal that there exist such safe spots which are resistant to adversarial perturbations and improve adversarial robustness when combined with adversarial training the authors term it as safespot aware adversarial training based on this approach the authors also propose an outofdistribution detection method that outperforms previous works strengths motivation is clear the proposed approach is interesting and different from existing works the practical application of the proposed framework is elaborated clearly technical details and formulations are clear results show that the proposed approach improves adversarial robustness and clean data performance on both cifar10 and imagenet furthermore the proposed approach greatly improves the robustness when evaluated with randomized smoothing the design of the approach enables outofdistribution detection that outperforms previous works weaknesses the major concern lies in the evaluation of the proposed technique here the authors find safe spots and also propose safespot aware adversarial training but evaluate on pgd based adversarial attack in a standard manner it is important to address the possibility of safe spot aware adversarial attack on the proposed defense and its success rate in case such attack is infeasible please provide the rationale behind that clarify the difference between sfull and spgd from the experiments section since sfull also uses tstep pgd how is it different from spgd though the outofdistribution detection results slightly outperform previous works under the fpr95 metric the performance gains are very minimal and not very significant compared to the baseline oe hendrycks et al 2019b under two metrics auroc and aupr final thoughts the proposed method is clearly motivated although the performance gains on adversarial robustness are significant there are critical points yet to be addressed therefore i marginally accept this paper post rebuttal the authors have addressed my concerns in the rebuttal however i also agree with the other critical points raised by other reviewers particularly reviewer 4 that are of major concern hence i retain my initial score and marginally accept the paperdocsepthe authors argue that there are some safe spots in the data space that are less prone to adversarial attacks the authors propose a technique to identify such safe spots they then leverage them for robust training and observe higher robust accuracy than baseline finally they leverage this observation to identify out of distribution data the application is important and the results look promising however i have the following concern the authors propose a new threat model where the adversary may have access to the labeled data they motivated such a setting with an example of google image search however such a setting is quite limited there are also existing methods that use supervised learning
setting with incorrect labeling the paper should discuss how they differ from such a line of work the search algorithm requires that a correct predicted label is available this setting is not quite realistic how can we find a safe spot when the label is unknown some of the findings are not quite surprising for example a safe spot is more in a robust model with small epsilon docsepthis paper proposes a new adversarial framework where the defender could preemptively modify classifier inputs to find safe spots that are robust to adversarial attacks they then introduce a novel bilevel optimization algorithm that can find safe spots on over 90 of the correctly classified images for adversarially trained classifiers on cifar10 and imagenet datasets and show that they can be used to improve both the empirical and certified robustness on smoothed classifiers besides they propose a new training scheme based on their conjecture about safe spots for outofdistribution detection which achieves stateoftheart results on neardistribution outliers overall the writing is clear and the idea is interesting i think they have the following contributions 1 propose a novel adversarial framework and motivate it by a realworld application search engine example 2 propose an effective algorithm to find safe spots 3 show the usefulness of safe spots for adversarial robustness and outofdistribution detection however i have the following concerns 1 i am wondering whether the safe spots actually exist based on the definition the classifier should have the same predictions on all data points in the epsilonball around a safe spot so the classifier has certified robustness on the safe spots but it might be hard to show that the classifier has certified robustness on the safe spots although they have shown that the attacks could not successfully find adversarial examples for the safe spots it might be due to the gradient masking issues could the authors try the autoattack proposed in 1 to see whether the classifier is actually robust on the safe spots 2 from their results we can see that the safe spots dont exist for naturally trained models we need to use adversarial training to produce safe spots but it is not surprising that adversarial training could produce safe spots in fact if we could solve the standard adversarial training objective optimally then any natural images from the training distribution should be safe spots could the authors explain why we need to find safe spots other than the natural images in this case 3 if the classifier is not robust on the natural image xo and the defender finds a safe spot xs around xo then from the attacker perspective why he could not first perturb xs to be xo and then perturb xo to find adversarial examples if the attacker could not find adversarial examples for xs then he may try other attack strategies like using larger epsilon or other perturbation types in such cases the proposed defense framework may not work could the authors explain it 4 for safe spotaware adversarial training they mention that the training procedure is more computationally demanding than pgd adversarial training then they use targeted fgsm or kstep pgd towards the groundtruth label as a proxy to safe spot search it is hard for me to understand what they exactly do in this part could the authors describe it in detail also why would the safe spotaware adversarial training be better than the standard adversarial training i think standard adversarial training can also produce safe spots is it because the safe 
spotaware adversarial training search for xa in a larger ball around xo bdeltaepsilonxo than the standard adversarial training could the authors try standard adversarial training with a perturbation budget of deltaepsilon to see if this is the case 5 for outofdistribution detection they conjecture that the samples from the learned distribution will have a higher probability of having safe spots compared to the outofdistribution samples but i dont think their results could support this conjecture in their training objective they explicitly minimize the probability of safe spot existence of the outlier samples so they try to train a model such that their conjecture holds it would be better if they could show whether their conjecture holds for naturally trained models or the models trained using outlier exposure i suggest they perform an ablation study for objective 4 also i think they miss some ood detection baselines such as 2 and 3 could the authors compare their method to them 1 croce francesco and matthias hein reliable evaluation of adversarial robustness with an ensemble of diverse parameterfree attacks arxiv preprint arxiv200301690 2020 2 mohseni sina et al selfsupervised learning for generalizable outofdistribution detection aaai 2020 3 liu weitang et al energybased outofdistribution detection arxiv preprint arxiv201003759 2020 after discussion with authors thanks for the clarification some of my concerns have been addressed and i have raised my score but i keep the concern that the proposed defense framework may be easily broken in practice given that the attacker can have unlimited power docsepthank you for your answers the paper proposes a new method for making adversarial attacks more difficult in their method the defender not the attacker modifies the original input xo to xs which is guaranteed to be safe in the sense that an attacker modifying xs will not manage to change the predicted class until large changes to the sample are performed the defenders budget for modifying the sample is denoted delta whereas the attackers budget is epsilon after some relaxations they arrive at the optimization problem stated in equation 2 find the modification xs subject to budget delta which minimizes the risk measured by crossentropy that any modification of xs by an attacker subject to budget epsilon is misclassified they also extend the idea to outof distribution ood detection though the main contribution seems to be in mitigating adversarial attacks during testing strong points the basic idea and derivation of the optimization problem is clearly written the idea of modifying an input image before classification is interesting apparently new and effective in mitigating the impact of adverserial attacks according to the experiments in 41 42 43 weakunclear points section 35 safe spotaware adversarial training is a little bit unclear to me is it that now the training data not the test data is modified but then it is not so clear to me whether the final training objective is still well defined it might just have a similar effect as adding noise to training samples furthermore it appears that the whole thing becomes difficult to train since during training it necessary to iterate between a ordinary model training and b modification of training samples section 36 outofdistribution detection is not convincing apparently the proposed method is not only hard to train but also has three important hyperparameters gamma lambda and mu which need to be carefully tuned therefore even though the authors 
report improvements over previous methods in section 4 i am not convinced that this is a practical approach to ood in section 41 i am not sure what the authors mean with our methods can find safe spots on over 85 of the test set images my understanding is that if the class label could not be changed by an attacker then the method was successful even if the original sample was misclassified however table 1 reports only classification accuracy ### Summary:
the reviewers recognized that the proposed method is interesting and seems to be useful in some cases and the authors provided sufficient empirical results to support their claim in addition some comments have already been clarified however some reviewers were still concerned that the proposed defense method will be defeated under some conditions and still had the major concern regarding the issue of adopting some attack strategies to find adversarial examples near the safe spots even though the authors clarified some critical points of the proposed method these drawbacks led to the decision to not accept however this paper has some merit and can be made into a stronger contribution in the future
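For readers unfamiliar with the safe-spot search discussed in the reviews above, the sketch below illustrates the proxy one reviewer mentions: a targeted k-step PGD update toward the ground-truth label, run by the defender inside a small L-infinity ball around the original input. This is only an illustrative approximation under assumed settings (a generic differentiable PyTorch classifier; the values of delta, steps, and step_size are placeholders, not taken from the paper), and the paper's full bilevel objective additionally contains an inner attacker maximization that is not shown here.

```python
import torch
import torch.nn.functional as F

def find_safe_spot(model, x_o, y, delta=8/255, steps=20, step_size=2/255):
    """Defender-side proxy: descend the cross-entropy of the correct label y
    with respect to the input, staying within an L-inf ball of radius delta
    around the original image x_o (targeted PGD toward the ground truth)."""
    x_s = x_o.clone().detach()
    for _ in range(steps):
        x_s.requires_grad_(True)
        loss = F.cross_entropy(model(x_s), y)
        grad = torch.autograd.grad(loss, x_s)[0]
        with torch.no_grad():
            # Opposite sign to an attacker's update: make the prediction more
            # confidently correct rather than flipping it.
            x_s = x_s - step_size * grad.sign()
            x_s = x_o + (x_s - x_o).clamp(-delta, delta)  # project into the delta-ball
            x_s = x_s.clamp(0.0, 1.0)                     # keep a valid image
    return x_s.detach()
```

Whether such a point is actually robust against an attacker with budget epsilon still has to be verified afterwards, e.g. with a strong attack suite such as AutoAttack, as one of the reviews requests.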
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: teaching agents to play lewis signalling games is still one of the most common ways of studying emergent communication and has historically been a test bed for our understanding of many ec dynamics in this paper the authors propose to view the standard lewis game loss as a composition of two terms one that essentially promotes a unique message for each object and one that promotes mutual understandability between speaker and listener it is this second term that is the main focus here deemed as coadaptation as the authors explore the effect that artificially controlling this factor has on the learned languages they find that when trained in the de facto way this term often overfits and this overfitting has a detrimental effect on compositionality strengths first off many thanks to the authors for presenting such a clear paper it was generally a joy to read the proposed decomposition makes a lot of sense and maybe more than that just formalizing the loss in this way may help other researchers to structure the hypotheses of their own work in particular i liked the formalization of ltrainadapt and its role in the loss function since this gets quite close i think to how one should discuss generalization to novel objectattribute combinations through compositionality in terms of the results the authors make a compelling case for the relationship between coadaptation overfitting hindering compositionality in languages both experiments with really handson early stopping regularization and with more traditional ml regularization on the coadaptation term yield better generalization weaknesses although i think this paper is mostly ready for publication in its current state and i do think it is beneficial to the ec community im not sure the scope of the contribution is significant enough to warrant publication at neurips both in terms of what benefit it brings to the ec community and in terms of its interest to the larger set of research at the venue i think the contribution is quite niche okay and maybe rehashing a lot of what the ec community seems to know does this work really shine new light on these issues or does it just reinforce the lessons learned by the slew of other ec work i think it comes down to this other cited ec work tends to also tackle the same problem of lack of compositionality stemming from coadaptation but focuses more on how solutions may emerge from population dynamics and which sets of agents are communicating with each other these will ultimately find themselves expressed in gradients in this work the authors cut straight through to the optimization itself in that sense it helps point research towards more desirable properties in the loss function but it is still up to the research community to find plausible ways for these pressures to originate because presumably as a model of language development we dont want strong assumptions governing the loss itself over the actual agentenvironment dynamics and the particular method of regularizing as applied here primarily the early stopping seems hacky and more for the purpose of probing than for simulation whereas other proposed methods like neural iterated learning apply similar constraints in a more plausible way so to that extent the closing argument l342 seems somewhat backwards to me researchers seem aware that coadaptation and lack of compositionality are undesirable so they design realistic and intuitive pressures
which ultimately have similar effects on the loss it is not surprising if a more handson approach goes directly to the loss to accomplish the same effect as the authors say remarkably this emergent compositionality does not result from things but it is instead created forcefully and not in some natural way that would be remarkable imo in conclusion like a good literature review i would like to see it published it could be useful to ec researchers but i find it difficult to see this particular analysis as sufficiently novel or important to future ec research in terms of limitations not sufficiently in terms of societal impact it should not be applicable docsepthe authors propose a framework for studying the generalization and compositionality of communication games playing lewis signaling games in short there are two agents a speaker and a listener and the speaker observes a random state from its environment eg an image and sends a signal to the listener the listener then undertakes an action based on this signal finally both agents are equally rewarded based on the outcome of the listeners action line 2426 the authors demonstrate how the agents loss can be decomposed into an information term that quantifies the degree of ambiguity of the language protocol and a coadaptation term that quantifies the gap between the listener and its optimum this observation allows studying the source of overfitting or lack of generalization in the communication agents especially when communication is in an unseen state from the training set the main idea is to decrease the importance of the coadaptation term in the loss so as to avoid the speaker to overfit the listener for doing so the authors experimented with using different training strategies continuous listener partial listener and early stopping listener section 33 the authors experimented using the egg toolkit with an environment in which every object has k 6 attributes each taking 10 different values for a total of 1 million objects and a nonoverlapping trainvaltest split is used the experiments confirm the hypothesis of the loss decomposition by showing the importance of the coadaptation term while evaluating an unseen set of objects from the test set moreover the authors show the effectiveness of the proposed method on more complex games celeba and imagenet showing a much stronger generalization compared to a standard continuous listener strengths the proposed loss decomposition is novel to the best of my knowledge the paper is formal and rigorous without being too verbose or hard to read in its presentation the experimental results are conclusive and wellconducted adding the final experiments on more complex tasks adds validation to the proposed hypothesis the paper is very well written and easy to follow the content is well organized and the plots add to the presentation not reported by the authors docsepthe paper provides a theoretical analysis of the objective of lewis communication games that reveal that the objective can be decomposed into two components 1 mutualinformation component that encourages the speaker to generate unambiguous highly informative messages on its input and 2 adaptation component that encourages the distribution of the speakers posterior to match the distribution of the listener the latter is directly optimized by both players and is zeroed when the listener achieves an optimal distribution over objects given a message following this theoretical analysis the authors hypothesize that overfitting issues in the adaptation 
Following this theoretical analysis, the authors hypothesize that overfitting issues in the adaptation game result in some previously reported problems in the emergent communication, such as low accuracy and lack of compositionality. To test this hypothesis, the co-adaptation term is estimated with a probe: a listener that is trained to convergence over the frozen speaker. By modifying the weight of the probe's loss and the original loss in the reward of the speaker, the influence of the mutual-information component vs. the adaptation component is tested. It is shown that (1) the adaptation term severely overfits while the information term does not, and (2) by replacing the continuous co-training of the speaker and the listener with an alternate training, where the listener is trained for n steps, it is possible to find a setting where the emergent language presents much superior properties.

Strengths: The paper is an excellent example of a simple and highly elegant theoretical observation that leads to significant empirical gains. It is largely clearly written, apart from some issues (see below), and the empirical results, especially Figures 2 and 3, are persuasive and surprising. They point to an inherent limitation that arises from optimization and may explain many of the previously reported issues with neural Lewis games. As such, the analysis in this paper has the potential of substantially influencing the study of emergent communication, and I strongly support its acceptance.

Weaknesses: While the presentation is largely clear, it would be beneficial to improve, in the camera-ready version, the presentation of the experiment that involves the weighting of the probe's loss and the original loss. In equations 10-11 the constant alpha is presented as a way to upweight the original loss, while in practice it is used to downweight the influence of the adaptation term; while the two are equivalent, these two conflicting views make it harder to understand the discussion in lines 237-244. I think this paragraph in particular has to be improved, particularly the sentence at line 240.

Yes.
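As an aside on the weighting discussed in the weaknesses above, the scheme can be pictured as follows. This is a minimal sketch in my own notation, not the authors' code; the function and argument names are hypothetical, and it assumes the probe listener's loss tracks the information term while the live listener's loss additionally carries the co-adaptation gap:

```python
import torch

def speaker_loss(probe_listener_loss: torch.Tensor,
                 live_listener_loss: torch.Tensor,
                 alpha: float = 0.1) -> torch.Tensor:
    """Mix the two loss estimates used to reward the speaker.

    probe_listener_loss: loss of a probe listener trained to convergence on the
        frozen speaker; it estimates the information (ambiguity) term alone.
    live_listener_loss: loss of the co-trained listener; it additionally carries
        the co-adaptation gap to the optimal listener.
    alpha: small weight; shrinking it down-weights the co-adaptation influence
        relative to the information term.
    """
    return probe_listener_loss + alpha * live_listener_loss
```

Written this way, it is immediate that alpha rescales the co-adaptation contribution relative to the information term, which is the framing that would make lines 237-244 easier to follow.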
### Summary:
This paper shows that the objective for Lewis games, as treated in the recent emergent communication (EC) literature, can be decomposed into two losses: an information loss (whether each message refers to a unique referent) and a co-adaptation loss (which quantifies the alignment of the speaker and listener). It shows that lack of generalization in unregularized EC is mostly due to the latter, and empirically shows that intervening on co-adaptation via regularization, early stopping, or reinitialization improves generalization.

The reviewers are generally positive about this paper. The one somewhat negative score is actually quite positive as well: the reviewer concedes that this is a solid paper and deserves to be accepted, even in its current state, at some venue, but questions the overall impact of the contribution and whether it merits publication in NeurIPS. Unfortunately, a fourth review, although promised by the reviewer, did not materialize in time even after repeated prodding, so I had to provide it myself; see below.

As an area chair I am somewhat torn about this paper. It is well-written, and I think the field will benefit from having these things made more explicit. Sadly, I also think it shows very clearly how frustratingly little progress EC has made as a field: we are still talking about the same things years after its revival, in uninteresting, unrealistic toy settings, looking for linguistics and compositionality when basic information/communication theory combined with basic optimization would probably be more adequate. Overall, I think this paper can help the field move in the right direction, and I am recommending conditional acceptance: the notation needs to be shored up, the assumptions and limitations need to be made much more explicit, and it needs to be made suitable not only for readers intricately familiar with EC but also for readers who are only just reading their first paper on the subject.

More detailed AC review.

Strengths:
Presentation: this paper is well-written; the decomposition makes a lot of sense and will not come as a surprise to anyone working in the field.
Soundness: the experiments and evaluation are thorough and appear to be easily reproducible given the details provided; the paper provides additional experiments on different, more complex games to overcome potential criticisms of its toy-task nature.
Impact: the EC literature, or even the field of EC broadly speaking, is extremely troubled by a poor understanding of how basic assumptions (e.g. questions of optimization setup) impact observations (e.g. compositionality), and there is an extraordinary amount of wheel-reinvention, exacerbated by the lack of a standardized evaluation protocol. This paper has the potential to help practitioners be more explicit about their assumptions.

Weaknesses:
Applicability and novelty: the main contribution of this paper, in my mind, is showing the decomposition and using it to elucidate the impact of training dynamics on the emerging communication protocol. However, this decomposition only applies in a limited setting, and that assumption is not nearly made explicit enough: the final loss is negative log-likelihood, and the speaker is trained via policy gradients without any constraints or regularization, either on the communication channel or the listener supervision. Almost all papers in EC are exactly about what constraints, regularization, or dynamics we can impose in order to obtain better generalization. I think the distinction (i.e. decomposition) between decodability (information) and learnability (adaptation) is well known amongst serious EC practitioners, and almost any paper published on the subject deals with these in some form.
Clarity: given the above observation and the fact that the decomposition follows from trivial math, the exposition itself is valuable if it makes something very explicit that was heretofore not explicit enough, such that future work will benefit from it being explicit. The paper has the potential to do this but, in my opinion, disappointingly falls short. Too much of the writing assumes too much prior knowledge on the part of the reader. For this paper to be maximally valuable, I would want it to be understandable by someone who doesn't know anything about emergent communication and reads this as their very first paper. This issue is particularly prevalent in the most important part, Section 2 and the corresponding appendix: the notation in the proofs is almost unforgivably convoluted, the sub- and superscript mixing for different parameterizations is unnecessarily confusing, especially with the listener parameterization phi never actually being introduced as such, and the actual proofs in the appendix never make explicit that the two losses concern the speaker and listener respectively. Try having this read by someone unfamiliar with the field: they'd instantly be lost. And all of it is basically building on the work from a very specific group of people who do things in a very particular way, using a framework (EGG) that nobody else uses, without ever making explicit that other EC papers do things very differently. There are definitely EC papers that do early stopping, freezing/probing in different phases of training, etc., but they tend to study what constraints can be imposed on top of the standard task formulation to make things generalize, usually in much more sophisticated games. I also did not like that the communication channel itself was not bottlenecked: with vt10 in this setting, everything collapses to basic information theory, and the assumption is not realistic for studying any emergent linguistic phenomena.

Overall, if it was up to me, I would have written this paper very differently, with an argument along the lines of: the EC literature is a mess; let's make things more explicit by decomposing the loss in the most basic EC setting; then we can understand all of the interventions in the prior EC literature (e.g. ensembles, freezing, populations, grounding, constraints, regularizations, etc.) in this new light, and we can even come up with some new approaches to tackling this problem, such as down-weighting co-adaptation. I think the paper has merit and I think it can be accepted, but it really has to address its shortcomings: Section 2 and the proofs have to be notationally extremely clear, with the proofs written out in much more detail, and the paper has to make its assumptions much more explicit, i.e. that it applies to the limited basic setting that was mostly studied by a very limited group of people.

Typos I came across:
"and experimental choice" -> "choices"
"to ease the reader intuition" -> "readers"
"deep model are large enough" -> "models"
"the train coadapation keeps dismishing" -> "coadaptation ... diminishing"
Below is given review of a research paper from a conference journal. Please write a summary of the review.

### Review:
Summary: In this paper the authors view MAML from the lens of reproducing kernel Hilbert spaces (RKHS), by applying tools from the theory of neural tangent kernels (NTKs). Based on these insights they develop two meta-learning algorithms that avoid gradient-based inner-loop adaptation. Their algorithms are theoretically grounded and exhibit improved empirical performance. Overall this is a solid contribution that should be of interest to the community; thus I recommend acceptance.

Strengths: This paper is generally well written and proceeds to develop insights into gradient-based few-shot adaptation on first principles from NTK theory. In particular, they establish that parameter adaptation trajectories are equivalent to functional trajectories in some RKHS under the induced NTK, which allows us to bring to bear tools and analysis from that field. In particular, the authors provide rigorous mathematical derivations and show that gradient-based few-shot adaptation of the initialisation can be approximated without inner-loop adaptation, under certain assumptions on the NTK (that it induces a relatively linear adaptation space), from which they derive two algorithms that avoid gradient-based inner-loop adaptation. They demonstrate empirically that the proposed algorithms perform better than similar algorithms on the standard few-shot learning setup on mini-ImageNet, that the method is significantly more robust to adversarial attacks, and that it enjoys substantially better generalisation to out-of-distribution tasks.

I am particularly impressed by the second algorithm, metarkhsii, which derives a closed-form solution to gradient-based adaptation in RKHS that they then map back into parameter space via NTK meta-learning. The closed-form solver can be thought of as learning a functional inference process for task adaptation. This provides a fresh new perspective and opens up several new research directions. In particular, the authors mention that inference over the NTK can render the search for task models linear, which provides benefits both in terms of robustness and generalisation; they offer some empirical evidence that this is indeed the case.

Weaknesses: While I welcome the principled approach taken in this paper, it is somewhat underwhelming that the first algorithm, which avoids an n^3 complexity in the data, is derived from a first-order Taylor series expansion of the original MAML in parameter space. After all, metarkhsi can be motivated as a first-order Taylor approximation to MAML without the need to involve RKHS. With that said, I do appreciate the effort taken to establish that this is equivalent to optimisation in function space. The gist of this algorithm is to convert MAML into a multitask objective with a regulariser that tries to maximise the gradient norm at initialisation. While I believe the precise regulariser introduced here differs from other works, I am missing a discussion of similar works that also propose ways of regularising MAML updates, e.g. [1], [2], [3] and follow-ups. Finally, I'm a bit unsure as to how metarkhsi behaves during meta-testing: given that it removes the adaptation loop at meta-training, does it simply use the meta-learner initialisation without adaptation, or do you perform gradient descent on meta-test tasks?
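To spell out the first-order point above (a one-step reconstruction of my own, not an equation taken from the paper): with a single inner step of size alpha,

```latex
\[
L_T\!\big(\theta - \alpha\,\nabla_\theta L_T(\theta)\big)
  \;\approx\; L_T(\theta) - \alpha\,\big\|\nabla_\theta L_T(\theta)\big\|^2,
\qquad
\mathcal{L}_{\mathrm{MAML}}(\theta)
  \;\approx\; \sum_{T}\Big[ L_T(\theta) - \alpha\,\big\|\nabla_\theta L_T(\theta)\big\|^2 \Big].
\]
```

With k inner steps the same expansion gives roughly k*alpha in place of alpha, i.e. a multitask loss plus a term that rewards large task-gradient norms at the initialisation, which is how I read the regulariser described above.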
Minor comments: At times the writing can be a bit aggressive; in the abstract you claim superiority of your paradigm, but improved performance on mini-ImageNet is not sufficient evidence to make such a strong claim. If metarkhsi does not do any adaptation on meta-test tasks, I do not think it is fair to say that it replaces the inner-adaptation process; it removes it in favor of a multitask solution, and this may not be a good strategy beyond few-shot learning. You mention that NTKs can yield loss functionals that are convex, which seems appealing; I would suggest making this connection to your proposed algorithms stronger. Eq. 2: is the transpose on the wrong gradient vector? On the line above it you use nabla as a column vector. Eq. 4: in the paragraph following it you speak of solving Eq. 4, but Eq. 4 is a functional and not a problem; do you mean min of Eq. 4? Thm. 2 is somewhat cryptic; what you mean to say is that the regulariser can be mapped into parameter space so that ..., correct? Thms. 3 and 4 could use some discussion: what is the takeaway, and how likely are these conditions to hold? The coloured lines in the robustness graphs are very hard to read; I initially thought that MAML achieved optimal robustness.

Post-rebuttal: I've read the rebuttal and am content with the response; I maintain my recommendation to accept this paper.

References:
[1] Guiroy et al. Towards understanding generalization in gradient-based meta-learning. 2019.
[2] Khodak et al. Provable guarantees for gradient-based meta-learning. 2019.
[3] Zhou et al. Efficient meta learning via minibatch proximal update. 2019.

This paper mainly deals with the computational issues of model-agnostic meta-learning (MAML). Specifically, it proposes two meta-learning algorithms where the hypothesis class (i.e. the mapping function set) is defined in the RKHS induced by the NTK. Extensive experimental studies on many tasks (i.e. regression, few-shot image classification, robustness to adversarial attack, and out-of-distribution generalization) illustrate its superiority compared with other baselines.

Pros:
1. Overall, this paper is well written and organized.
2. This work is based on recent solid theoretical results (i.e. NTK) from the deep learning theory.
3. The proposed methods have promising performance empirically.

Cons:
1. For the proposed first algorithm (i.e. metarkhs1): what is the connection or difference between the introduced regularizer in Eq. 4 and some commonly used regularizers, e.g. $\|f\|_{\mathcal{H}}^2$, in RKHS? Furthermore, could the objective function in Eq. 4 become negative? Please give more explanation.
2. For the second algorithm (i.e. metarkhs2): the authors claim that it can have a closed-form adaptation function. In my opinion, this is mainly because the loss function is squared loss, just like kernel least-squares regression. If the loss changes to other losses, such as cross-entropy loss, it cannot get a closed-form solution; if so, the algorithm also needs double-looped optimization. Please clarify it.
3. Although this paper mainly focuses on the NTK-induced RKHS, other kernel functions (e.g. RBF) also can hold for these two algorithms. It is interesting to test whether NTK is better than RBF in the meta-learning setting, such as the regression task.
4. The formal proofs for the theorems in this paper are mainly based on previous results. It is better to summarize the technical differences for clear theoretical contributions.

Summary: In the attempt to create an adaptation-free meta-learning method, the authors construct an RKHS based on the NTK and explain how to do gradient-based, MAML-style meta-learning in this space instead of parameter space. They propose two energy functionals, the first one based on maximizing the norm of the parameters and the second one making the adaptation in closed form based on the NTK, that approximate MAML's infeasible learning objective.
Evaluation of the functionals doesn't require evaluating the function outside of the current parameters theta, thus alleviating the need for explicit adaptation, which is a cause of technical problems in other methods. The authors show theoretical arguments confirming their method is closely related to MAML, as well as the results of extensive experiments in few-shot classification, regression, and out-of-distribution generalization tasks. The authors claim the result of training their second method is robust to adversarial examples.

Good points:
1. The benefits of the presented method are obvious: problems associated with a long adaptation loop are a common yet unsolved issue contemporary methods struggle with.
2. Experiments are extensive and their results give a convincing argument that the method works.
3. I find the method intuitive, as the high norm of the parameters coincides with parts of parameter space from which one can adapt the most within few gradient steps, i.e. MAML will be naturally seeking these places too.

Overclaims:
1. The authors claim their method is a first single-looped meta-learning algorithm. It's not clear from the paper what is formally meant by that: why is iMAML / Reptile / WarpGrad (Flennerhag et al. 2019, https://arxiv.org/abs/1909.00025) not single-looped? In general, a comment on WarpGrad, which learns a good-for-adaptation geometry through a gradient preconditioner, seems warranted.
2. The authors claim their method is robust to adversarial attacks. They demonstrate it using just two attacks, a black-box and a white-box one, which I don't consider extensive enough; there is a known bias of adversarial defences guarding only against particular attack methods and being susceptible to others (cf. Athalye et al. 2018, "Obfuscated gradients give a false sense of security", or Uesato et al. 2018, "Adversarial risk and the dangers of evaluating against weak attacks"). Furthermore, it's not clear that the chosen baselines are the best meta-learning methods out there in terms of adversarial attacks to compare to, which makes the claim "our methods are much more robust to adversarial attacks than existing approaches" groundless.
3. Results on classification / out-of-distribution generalization are OK-ish to support the claims of the paper, but for completeness I believe it would be preferable to note the SOTA performance on these tasks; otherwise a careless reader may be under the impression that the SOTA is held by the authors' methods.

Personal biases: I believe short adaptation unrolls are a dead end; as we get to meta-learn harder/wider task distributions, there won't be any place in the parameter space from which we could achieve good fine-tuning performance without an expressive adaptation procedure. This is particularly clear in meta-RL, where without doing gradient steps the data we will be obtaining won't fully describe the MDP we need to solve; one may imagine a task with multiple rooms where only after performing adaptation we will pass the correct door and observe the actual task.

Small: in Sec. 4.4, Nilsback & Zisserman 2008 should be a \citep.

The authors propose two meta-learning algorithms in the reproducing kernel Hilbert space (RKHS) induced by the recently proposed neural tangent kernels (NTK). The authors show how their algorithms obviate an explicit inner loop, or task-adaptation step, in the meta-learning training phase: in the first algorithm no explicit adaptation function is used, whereas in the second a closed-form adaptation function which invokes the NTK is proposed, which is a simpler adaptation than that of MAML and hence offers computational efficiency. The work is interesting
and supported by theory inspired from the NTK theory, and adds to the newly expanding literature on the use of kernels in meta-learning. Unlike the authors claim in the introduction, theirs is not the first meta-learning paradigm in the RKHS (cf. Wang et al. 2020, Cervio et al. 2019). The authors perform extensive experiments on regression and classification datasets and compare their results with other MAML-type algorithms. The experimental results do not show significant gains in terms of performance over the existing MAML approaches, except in the case of out-of-distribution datasets and adversarial attacks, where it is shown to outperform the others. The performance similarity to other methods is not surprising, since the proposed approaches can be seen as efficient approximations of the MAML. I give my detailed comments next.

Firstly, I believe the title of the paper could be changed to "meta-learning in the RKHS induced by the NTK" or something more specific; currently it comes off as rather broad and disproportionate to the work and existing work.

I had some issues with the claim that both the metarkhs approaches do not need an explicit inner-loop adaptation. This does not seem true: in the case of metarkhsi, an inner adaptation is not needed in the meta-training but is necessary, just like the MAML, for a test task; in the case of metarkhsii, the NTK gradient-flow-based adaptation in Eq. 6 forms the inner loop, just that it is a more efficient inner update than the MAML. The authors must consider rephrasing the claim to reflect these.

metarkhsi:
1. From what I understand, the treatment is based on an approximation to the MAML with k inner gradient descent steps through a Taylor expansion. To this extent one can expect the performance to be similar to that of k-step MAML on using the step size k*alpha in Equation 4; indeed, we see this in the experimental results.
2. It is unclear as to how, once the meta-parameter theta is learnt, the parameters for a new task are obtained from it. From what I see, it is obtained by k steps of actual gradient descent, just like the MAML. This was not mentioned anywhere clearly in the manuscript; I estimated this from the experiment plots in Figure 5 of the appendix, which mentions different inner steps.
3. The Taylor expansion in Equation 2: under what case is this expansion valid? Does this again assume a high degree of similarity between the tasks?
4. Before Equation 3, "Equation 2 is an unbiased estimator of ...": this claim is not immediately evident to me; it would be better if the authors could expand a bit here.
5. The connection to the NTK seems a bit weak and superficial. Based on Eq. 3, the authors propose the energy functional in Eq. 4 for learning the meta-parameter, and this is where the RKHS comes in. However, in the very next paragraph, in Theorem 2, the equivalence of the gradient in parameter space and the functional gradient is claimed; here I do not see what properties of the NTK are being invoked. Eq. 4 can be written with the parameter gradient instead of the gradient in the RKHS, and the approach will continue to hold valid. The authors should bring out the connection to RKHS and NTK more clearly; at present it seems to me that the approach does not explicitly have connections to an RKHS. Indeed, the proofs of Theorems 3 and 4 also do not seem to use the properties of RKHS or the NTK.

metarkhsii: Here the authors propose an adaptation function based on the NTK and the gradient flow. This is an interesting adaptation function that evokes the NTK and, in the process, can help approximate a k-order inner gradient and yet be free of the computational difficulties that come due to this in the standard MAML systems.
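For reference, the kind of closed-form update being discussed looks as follows; this is my reconstruction of the generic NTK gradient-flow result under squared loss in the linearised regime, not necessarily the paper's exact Eq. 6. Training on a support set (X_s, y_s) to convergence gives

```latex
\[
f_{\mathrm{adapt}}(x)
  \;=\; f_\theta(x) + k_\theta(x, X_s)\,\big[K_\theta(X_s, X_s)\big]^{-1}\big(y_s - f_\theta(X_s)\big),
\]
```

where $k_\theta$ is the NTK at the meta-learned initialisation $\theta$. This also makes the squared-loss caveat raised by another reviewer concrete: the closed form is kernel least-squares, and inverting the support-set Gram matrix is presumably where the n^3 cost mentioned earlier comes from.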
Unlike metarkhsi, the connection to NTK is strong and clear here. The authors should consider expanding and emphasising this portion better; for example, what is the implication of Theorem 5? Last paragraph on page 5, "intuitively ... thus can be deemed more robust": I do not follow this; could the authors expand here? I also found the robustness discussion to be needing clarity on the whole; in figure i, the axes and the legends are not readable.

Sections 4.3 and 4.4: the metarkhs methods significantly outperform other approaches in the case of adversarial attacks and out-of-distribution tasks. This is of merit, since it is known that MAML-type approaches are far too sensitive to outlier tasks. Can this robustness be better explained or mathematically analysed in terms of the RKHS or the NTK? This would greatly strengthen the contribution.

Overall, I feel that this is an interesting and novel contribution, particularly in terms of mathematical concepts, though the approach does not necessarily outperform similar methods by a significant margin except in the case of out-of-distribution tasks or adversarial attacks, and the connection to RKHS seems a bit weak, currently appearing in the form of the NTK, and that too evidently only in metarkhsii. A suggestion: also consider dataset cases where MAML necessarily requires multiple inner gradient steps, compared to just a one-step inner gradient update (the first-order MAML); currently all the examples in the paper show similar performance for both MAML and first-order MAML. This could help verify if the metarkhs approaches can indeed achieve similar performance as multi-step inner-gradient MAML while having much lower complexity.

References:
Wang et al. 2020: Haoxiang Wang, Ruoyu Sun, and Bo Li. Global convergence and induced kernels of gradient-based meta-learning with neural nets. 2020.
Cervio et al. 2019: J. Cervio, J. A. Bazerque, M. Calvo-Fullana, and A. Ribeiro. Meta-learning through coupled optimization in reproducing kernel Hilbert spaces. 2019 American Control Conference (ACC), Philadelphia, PA, USA, 2019, pp. 4840-4846, doi: 10.23919/ACC.2019.8814419.

### Summary:
This paper considers meta-learning based on MAML. The authors use neural tangent kernels (NTKs) to develop two meta-learning algorithms that avoid the inner-loop adaptation which makes MAML computationally intensive. Experimental results demonstrate favorable empirical performance over existing methods. The paper is generally well written and readable, the proposed methods are well motivated and based on solid theoretical ground, and the empirical performance shows advantages in efficiency and quality. This work is worth acceptance in ICLR 2021.
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: in this paper the optimization of instance-dependent sample complexity for reinforcement learning (rl) under linear mdp is studied. the pac policy learning setting is considered, which is to identify the best policy among all policies. an elimination-style algorithm with a novel exploration strategy is proposed; the key idea lies in using online frank-wolfe to explore the directions which reduce the sample variances most. the algorithm achieves the first instance-dependent sample complexity guarantee for linear mdp. the analysis also shows that low-regret algorithms can be suboptimal in certain instances, showing the advantage of the proposed algorithm.

strengths:
1. the paper proposes an algorithm that is the first to achieve the instance-dependent sample complexity guarantee under linear mdp. the result is original and novel.
2. the exploration strategy based on online frank-wolfe is interesting and can be of independent interest for other bandit learning problems.

weaknesses:
1. according to the algorithm, it is implicitly assumed that the policy set has finite cardinality. this assumption is not reasonable: since there are infinite states in a linear mdp, the number of all policies should also be infinite. i think this is a major limitation of the proposed algorithm and analysis.
2. another limitation is that the instance-dependent term in theorem 1 does not have a clear explanation; it is not easy to work out for which kind of linear mdp the proposed algorithm can be advantageous.
3. i also have some doubts about the exploration step in line 6 of algorithm 1. the number of data points to collect is set to kh l, which is closely related to the overall sample complexity, but there is no explanation of how this number is set. furthermore, the sample complexity of algorithm 2 is given in theorem 6, which is of order o(1/epsilon_exp), but there is no definition of epsilon_exp in the paper. is epsilon_exp = epsilon? if this is true, i can't understand why the instance-dependent bound in theorem 1 can hold, since, as a step of algorithm 1, algorithm 2 has already required o(1/epsilon_exp) instances.
4. i think some proofreading needs to be done to improve the notational explanations; there are some notations without definitions, either previously or immediately after their first appearance.

overall, i think the paper proposes interesting and novel results, but the weaknesses discussed above are not ignorable for me currently. after rebuttal: i increase my score since the responses address most of my doubts; i suggest including the explanations in the paper to improve clarity. the limitations and potential negative societal impact are properly discussed.

this paper studies the problem of sample complexity of rl with linear function approximation. recent works have extensively studied this problem with a range of results for different versions and structural assumptions; however, this paper presents the first instance-dependent sample complexity analysis for the linear mdp setting. the proposed algorithm is based on a policy elimination scheme which iteratively maintains a set of policies which lie within a 2l optimality gap from the optimal policy. the update step for this algorithm is based on a novel online experiment-design procedure which guarantees the collection of a sufficiently exploratory dataset of state-action pairs at each step. in terms of the formal results, the authors show the first instance-dependent sample complexity analysis for linear mdps, which can be shown to also yield a dimension-optimal rate in the worst case. further, it is formally shown that there exist instances where the instance-dependent rate improves over the sample complexity of worst-case optimal pac-rl algorithms. lastly, it is shown that the instance-dependent rate for the linear case recovers the instance-dependent rate for deterministic mdps, whereas it gives another instance-dependent rate for tabular mdp, which can be provably better or worse than previous instance-dependent rates for specific instances.

overall, this paper is very well written and shows important and novel results for linear mdps. i especially like the thoroughness of this paper, where multiple aspects of instance-dependence are explored.
1. instance-dependent analysis for linear mdps: the algorithm is well explained, and a novel method for the experiment design objective in 4.1 is also proposed and analysed. in the final rate, the authors show that the rate depends on factors which relate to the hardness of exploring any direction in the feature space phi and a policy gap term for the input policy class. a final result is obtained by showing that a policy class can be constructed for linear mdps, whose size exponentially scales with d, to derive the instance-dependent guarantee with respect to the optimal policy.
2. demonstrating the utility of the instance-dependent rate: the authors show, using prop 3, that there exist linear mdps such that the instance-dependent rate improves over the worst-case rate by a factor of d^2. further, it is shown that no existing pac-optimal algorithm will get a polylog(d) rate for this problem.
3. discussions for tabular and deterministic mdps: i especially like the result in prop 5, which shows that the sample complexity for their algorithm is in fact another possible instance-dependent rate for tabular domains and can improve over previously shown instance-adaptive algorithms, or in some cases perform worse than them. it brings an interesting question of unifying these different instance-dependent measures of complexity and highlights possibilities of improvement even for the linear mdp case. for deterministic mdps, the previously known rate can be recovered up to h factors.

weakness: this is a strong first result in this direction and hence has certain aspects which can possibly be improved.
1. computational efficiency: since the algorithm follows a policy elimination template for establishing a guarantee wrt the optimal policy, an exponentially large policy set needs to be constructed. the experiment design objective then iterates over the expected feature vectors for each policy to eliminate suboptimal candidates. as such, it is not clear if a computationally efficient version of the solution can be derived.
2. dependence on explorability parameter: the authors need to assume that there exists a policy which generates a full rank covariance matrix for the state-action features. in my understanding, this also affects the algorithm, as the covariance matrices constructed in the experiment design steps are guaranteed to be nonsingular using this assumption. can the authors comment about the necessity of this assumption or potential ways to remove it?

the authors do discuss the potential improvements in the conclusion and formally show that their algorithm does not yield the best sample complexity for some tabular domains; as such, i do not have any concerns about the discussions of the limitations.

this paper considers instance-dependent complexity of learning under linear mdps, based on a novel online experiment design in linear mdps. the author proposes an algorithm whose sample complexity scales with the instance-dependent complexity. this algorithm is worst-case optimal and beats any low-regret algorithm on an explicit example constructed by the author.

strengths:
- interesting question: the question of obtaining worst-case optimal algorithms with linear function approximation has been well studied recently; when it comes to instance-dependent rl, most of the recent work focuses on the tabular setting. obtaining computationally efficient algorithms with tight instance-dependent sample complexity under linear mdps is still an open question in rl.
- great novelty: the author uses an online experiment-design-based procedure to reduce the uncertainty of their algorithm.
- enough comparison with the previous work: in section 4.3 the author reduces their result to tabular and deterministic mdps and carefully compares their results with the existing results.
- well written and easy to follow: clear algorithms, definitions, assumptions, and theorems.

weaknesses:
- lack of tightness: the paper does not show if the algorithm obtains tight instance-dependence.
- computationally inefficient algorithm.

lack of tightness wrt instance-dependence; not tight wrt h dependence; computationally inefficient; the sample complexity result scales with the hardest-to-reach direction. ### Summary:
this paper considers instance-dependent complexity of learning under linear mdps, based on a novel online experiment design in linear mdps. the author proposes an algorithm whose sample complexity scales with the instance-dependent complexity. this algorithm is worst-case optimal and beats any low-regret algorithm on an explicit example constructed by the author. all reviewers are convinced by the contribution of this paper and we recommend acceptance.
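the reviews above center on an online experiment-design step: choose a data-collection distribution that shrinks variance in the hardest-to-reach feature direction. the frank-wolfe-style sketch below shows the general flavor of that idea; the variable names, step size, and iteration count are assumptions for illustration, not the reviewed paper's actual procedure.

```python
# Minimal sketch: Frank-Wolfe-style experiment design over candidate feature directions.
import numpy as np

def fw_design(Phi, n_iters=200, reg=1e-6):
    """Phi: (n_dirs, d) candidate feature vectors (e.g. expected features of the
    surviving policies). Returns design weights that reduce the worst-case
    uncertainty max_i phi_i^T A(lam)^{-1} phi_i."""
    n, d = Phi.shape
    lam = np.full(n, 1.0 / n)
    for t in range(n_iters):
        A = Phi.T @ (lam[:, None] * Phi) + reg * np.eye(d)
        A_inv = np.linalg.inv(A)
        scores = np.einsum("id,dk,ik->i", Phi, A_inv, Phi)  # phi^T A^{-1} phi
        i_star = np.argmax(scores)            # hardest-to-reach direction
        step = 2.0 / (t + 2)                  # generic Frank-Wolfe step size
        lam *= (1 - step)
        lam[i_star] += step
    return lam

rng = np.random.default_rng(1)
Phi = rng.normal(size=(50, 4))
w = fw_design(Phi)
print(w.round(3).max(), w.sum())
```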
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review: this work addressed an interesting problem and a very relevant one how to audit generative models the most popular metric to evaluate generative models is fid but it is extremely opaque and population based this paper proposes new fidelity metrics that overcome both these challenges while the definitions of these metrics are well grounded their measurement process itself which is done via trained nn classifiers is a bit unconvincing the authors demonstrate that the proposed metric can recover the real ranking of generative models on a synthetic data generation task and can discover issues like mode collapse strengths the paper addresses an important topic that is time critical and relevant to the generative modeling community and the broader sphere of ai it is well written and motivated and a pleasure to read overall the authors clearly point out the issues that are lacking with existing metrics and motivate the need to define the new versions of precision recall metrics moreover the metrics enable a new use case of model auditing which can be very relevant in safety critical applications weaknesses 1 the paper promises a lot in terms of the defined metrics but the evaluations are a bit underwhelming the experiment with the predictive model depends on the downstream classifier as well the performance of the generative model is tied to the performance of the classifier hence the hyperparameter settings and training settings for the classifier should be explained clearly 2 another aspect that is missing from the evaluation is the application of model auditing on real life critical data the idea of model auditing becomes useful mainly in cases where we need to filter at the sample level when a generative model is trained on medical data imaging how do the metrics help in model auditing 3 an interesting question to probe is how to compare different generative models that have comparative expertise for instance one model could be an expert in color and another could be in shape and so on can we use the defined metrics to identify such behaviors 4 please provide error bars on the predictive modeling experiments as in account for the randomness in the generative model training process overall the paper does a good job in addressing a relevant problem in a well motivated manner even though there are some open concerns regarding the evaluation the fact that the current approach is one of the first to facilitate model auditing for generative models makes this a good contribution authors please address my concerns in the weaknesses listed above

the paper presents a methodology assessing the performance of a generative model in a domain agnostic fashion a 3dimensional metric space is proposed fidelity output quality diversity coverage of expected variability of output and generalization to what degree the model avoids memorizing training data ie is truly generative particularly the latter element is novel and the manner in which it may be evaluated by developing an authenticity measure for the task three illustrative use cases in image mnist and medical patient data covid19 hide and seek seq2seq data domains are provided the paper contributes useful thought and discussion for how best to argue the holistic performance of a generative model contrasting with fid and other domainspecific approaches there are novel perspectives in this paper that are worth sharing with the community particularly those contributing to discussion around generalization vs memorization copying where it is shown that prior metrics ignoring this aspect otherwise indicate performant models cf timeseries neurips challenge data the paper could be improved by drawing a deeper relationship with prior thought in this area eg adlam et al 2019 meehan et al 2020 who are cited in sec 322 but not discussed indeed the brief literature review is relegated to the paper appendix and was not helpful in arguing for the novel contribution of the paper in context of such work the paper also lacks any reflection or conclusion on the limitations of the proposed metrics it was unclear to me how the proposed authenticity classifier would work for multimodal distributions given the assumptions of noise within a hypersphere it did not seem like it would be practical for realistic complex data problems a use case on highdimensional data eg images beyond a toy mnist example would have better demonstrated the practical utility of this work for data where the need of such performance metrics is clearer i am on the borderline tending toward accepting the paper given the interesting discussion around authenticity and memorization but i am not fully convinced this is a practically useful paper

this paper targets a 3dim metric alphaprecision betarecall and authenticity that quantifies the fidelity diversity and generalization performance of a generative model the proposed metric serves as softboundary classifiers between real and generated sphericalshaped supports and improves the robustness against outliers and generation failure cases this samplewise metric can audit models by judging individual synthetic samples by their quality strengths clear writing and demonstrations meaningful research topic technically reproducible weaknesses the technical novelty is marginally revised from sajjadi et al 2018 the experiments are incomplete and therefore unconvincing the testing datasets are a bit toylike like sajjadi et al 2018 please also experiment on celeba 1 or ffhq 2 for mode collapse evaluation please consider using attribute classifiers pretrained on celeba as well as ivom 3 the testing generative models are out of date results on oldfashioned architectures are not conclusive for cuttingedge research works imagine our community works on a new gan paradigm and targets to outperform stylegan3 1 are we convinced to use the proposed metric which has only been validated on toy datamodels for gan please consider validating on stylegan3 4 for vae please consider validating on vqvae2 5 1 liu ziwei et al deep learning face attributes in the wild iccv 2015 2 karras tero samuli laine and timo aila a stylebased generator architecture for generative adversarial networks cvpr 2019 3 metz luke et al unrolled generative adversarial networks iclr 2017 4 karras tero et al aliasfree generative adversarial networks neurips 2021 5 razavi ali aaron van den oord and oriol vinyals generating diverse highfidelity images with vqvae2 neurips 2019 see the main review

the authors propose a mechanism to calculate the goodness of a generative model on a persample level this method is model agnostic this can be used to audit the model posthoc strengths 1 well written easy to follow 2 timely topic 3 discussion of neurips challenge is important weaknesses 1 i did not learn something substantial from reading the paper 2 the claims are too strong detailed comments 1 the generalized definitions of precision and recall are effective markers of understanding synthetic data quality however empirically they do not outperform other distribution measuring mechanisms such as fid pw w and i suspect methods like kl divergence or mmd or tv distance will also perform comparably 2 to my understanding approaches like fid kl divergence mmd tv distance are also model agnostic i humbly request the authors to rethink the positioning of their work along this particular claim 3 the appeal in this work lies in the fact that the proposed measures can be calculated on a persample basis unlike prior works 4 empirically the authors consider a few issues which may result in poor fidelitydiversity mode collapse being one of them if the models did not generalize during training how do these methods capture them 5 could the authors clarify how the proposed method provides interpretability the authors observe correlations between induced failures and their scores but they cant really attribute the score to a particular failure event 6 the proposed measure for authenticity relies on the ability to detect overfitting however recent work by feldman et al argue that overfitting or memorization in some cases is essential for generalization can the authors comment on the same key requirements i would urge the authors to empirically validate if overfitting has occurred to validate their claims maybe through mi attacks

### Summary:
the authors propose a new set of metrics for evaluation of generative models based on the wellestablished precisionrecall framework and an additional dimension quantifying the degree of memorization the authors evaluated the proposed approach in several settings and compared it to a subset of the classic evaluation measures in this space the reviewers agreed that this is an important and challenging problem relevant to the generative modeling community at large the paper is wellwritten and the proposed method and motivation are clearly explained the initial reviews were borderline and after the discussion phase we have 2 borderline accepts one strong accept and one strong reject after reading the manuscript the rebuttal and the discussion i feel that the work should not be accepted on the grounds of insufficient empirical validation establishing a new evaluation metric is a very challenging task one needs to demonstrate the pitfalls of existing metrics as well as how the new metric is capturing the missing dimensions in a thorough empirical validation while the former was somewhat shown in this work and in many other works the latter was not fully demonstrated the primary reason is the use of a nonstandard benchmark to evaluate the utility of the proposed metrics i agree that covering a broader set of tasks and models makes sense in general but it shouldnt be done at the cost of existing wellunderstood benchmarks i expected to see a thorough comparison with 1 one of the most practical metrics used today which can be easily extended to all settings considered in this work notwithstanding the drawbacks outlined in 2 what are the additional insights what is 1 failing to capture in practical instances does the rank correlation change with respect to modern models across classic datasets beyond mnist and cifar10 this would remove confounding variables and significantly strengthen the paper my final assessment is that this work is borderline but below the acceptance bar for iclr i strongly suggest the authors to showcase the additional improvements over methods such as 1 in practical and wellunderstood settings commonly used to benchmark generative models eg on images the experiments suggested by the reviewers are a step in the right direction but not sufficient 1 improved precision and recall metric for assessing generative models tuomas kynknniemi tero karras samuli laine jaakko lehtinen timo aila neurips 19 2 evaluating generative models using divergence frontiers josip djolonga mario lui marco cuturi olivier frederic bachem olivier bousquet sylvain gelly aistats 20
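The reviews and the meta-review above discuss samplewise fidelity and authenticity scores built from spherical supports around real samples. The sketch below is a minimal nearest-neighbour illustration of that idea, not the estimator from the paper under review; the choice of k, the hypersphere radius rule, and the memorization test are assumptions made only for this example.

```python
# Illustrative nearest-neighbour sketch of samplewise precision and authenticity
# checks; the radius rule (k-NN hyperspheres) and the memorization test are
# assumptions for this example, not the estimators from the reviewed paper.
import numpy as np

def knn_radii(x, k=3):
    # distance from each point in x to its k-th nearest neighbour within x
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    d.sort(axis=1)
    return d[:, k]  # column 0 is the zero distance of each point to itself

def precision_per_sample(real, fake, k=3):
    # a generated sample counts as high fidelity if it falls inside the
    # k-NN hypersphere of at least one real sample
    radii = knn_radii(real, k)
    d = np.linalg.norm(fake[:, None, :] - real[None, :, :], axis=-1)
    return (d <= radii[None, :]).any(axis=1)

def authenticity_per_sample(train, fake):
    # flag a generated sample as likely memorized when it lies closer to a
    # training point than that point's own nearest training neighbour
    nn_dist = knn_radii(train, k=1)
    d = np.linalg.norm(fake[:, None, :] - train[None, :, :], axis=-1)
    j = d.argmin(axis=1)
    return d[np.arange(len(fake)), j] > nn_dist[j]  # True means authentic

rng = np.random.default_rng(0)
real, fake = rng.normal(size=(400, 8)), 0.9 * rng.normal(size=(400, 8))
print(precision_per_sample(real, fake).mean(), authenticity_per_sample(real, fake).mean())
```

Averaging these boolean per-sample flags gives corpus-level scores, while the flags themselves support the per-sample auditing use case the reviewers highlight.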
input_ids: [token id sequence omitted]
attention_mask: [sequence of 1s omitted]
labels: [token id sequence omitted]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review: the author proposed a memoryaugmented transformer where special memory tokens are added to the attention mechanism and test its performance on language modeling and sequence modeling tasks the author shows that their technique requires less memory in the language modeling tasks to achieve the same accuracy compared to a stateoftheart model and outperformed baselines where processing long sequences is required the method that the author proposed is simple and their experiments showed strong performance of their method however the experiments focus mainly on the copy and reverse tasks and it remains to be seen how the method performs on other tasks also the perplexity that the author shows is slightly worse than that of the baseline it remains to be shown whether their method can match the perplexity of the baselines and whether that would mean higher training cost time and memory etc the author may use a separate paragraph to explain further the societal impact of their work as it is related to large language models which may raise environmental and other concerns

in this paper authors propose a recurrent transformer including the memory to store and process local and global information in an input segment and transfer info between segments they add special tokens at the start and end of each input segment and define the output of these special tokens as read and write the representation of the write will be fed to the next segment they claim that their model can achieve results similar to transformerxl with smaller memory sizes strengths 1 the idea is clever and easy to follow to resolve the issues brought by the autoregressive property read token can be the memory for its following tokens and write token can be the summary of this segment which can be used for the next segments 2 the experiment is sufficient and comparison to the transformerxl is also sufficient to me weakness 1 my main concern is the training process of the rmt to me since the gradient will be back propagated to the very beginning is there any forgetting for the memory token so the token in the last segment will forget the contextual information of the first segment 2 since transformerxl will stop the gradient for each segment i think the training process will be more stable than rmt can you confirm this 3 since you claim that rmt can benefit longer sequence modeling due to the quadratic computational complexity it would be reasonable to include some other efficient transformer and its alternatives such as sparse transformer luna s4 in the table 2 for comparison yes

this paper shows a new method for adding recurrence to transformers via special overlapping memory tokens between segments hence the new transformer architecture is called recurrent memory transformer rmt a special type of tokens aka memory tokens are prepended and appended to each segment and the number m of memory tokens to add is a new hyperparameter the gradients can propagate to past segments through memory tokens which is the distinct feature from previous architectures such as transformer xl the paper uses transformer xl as the baseline transformer is easy to perform parallel training and it suffers less from vanishing gradients compared to recurrent neural networks rnns which made it popular nowadays however transformer is more expensive than rnns due to its extensive use of attention mechanism especially during inference another downside is increasing the sequence length modelled by transformers is difficult as the statespace of the memory is much larger than rnns and again the computational cost for attending to previous context increases linearly as the context length increases previous works have addressed this issue by caching the transformer state of the immediate previous segment and inventing a new type of positional embedding that only cares about relational difference or compressing the states of previous context the former trxl has been often considered as the baseline for transformers introducing recurrence although trxl can arrange longerterm contexts for sequence modelling tasks there are discontinuities in the backward pass so the model doesnt effectively model longterm temporal dependencies also the transformer state of the previous segment is directly given as the memory which can increase the computational cost and memory requirement for the forward pass this work provides a way to 1 pass the gradients from the future segments which is presumably a more principled way of modelling longterm dependencies 2 connect consecutive segments with small overheads m memory tokens where m is much smaller than the segment size n i think the claim recurrence with memory has no limitations on effective context length written in the caption of figure 2 doesnt sound right it is difficult to increase the context length without trading off other factors such as the size of model parameters due to limited memory space of the hardware which is mentioned in section 5 the authors wrote that rmt can suffer from instabilities and oom issues when using a larger number of bptt unrolls this could affect the scaling property of rmt overall the writing is good it conveys the idea very clearly their experiments look convincing and i will leave my comments below comments about the experiments the results from synthetic tasks look good figure 4 however the performance of the rmt on wikitext103 shows that trxl still performs the best and rmt can achieve performance similar to the trxl variant which uses halfsized memory i think the choice of segment length only 50 and 150 is quite limited and also the length of memory in this particular experiment and 150 tokens sounds quite short as a sequence length in the first place it would be nice to add extra longer lengths such as 300 600 for this task as well however the experiments on enwiki8 figure 5 show nicely that the baseline transformer and trxl both show worse results as the sequence length increases but rmt is more robust and preserves its performance it would be nice if you can write more details about the models optimizer tokenization and training procedure these details should be contained in the main manuscript for reproducibility and readability one thing that i would like to propose to the authors as a reviewer to make the paper look stronger is to add a bit more baselines that tried to compress the previous contexts eg compressive transformer also rmt and feedback transformer have a close relationship the rmt can be seen as a blocklevel recurrent version of feedback transformer one could argue that instead of providing one memory token per each step rmt provides m memory tokens as the context for processing a segment of n tokens maybe adding feedback transformer and its variants providing m memory states similar to rmt for processing an arbitrary number of tokens in the new segment minor comment figure 7 referenced in the main paper is not included in the main but in the appendix it would be better to make this more clear the paper didnt properly discuss the limitations of the proposed method adding only one sentence in section 5 furthermore we observed instabilities and outofmemory issues during rmt training for a larger number of bptt unrolls and memory sizes i suggest being more clear about this and how it can affect the scaling property of the proposed method the negative societal impact was not discussed in the paper so na

### Summary:
the paper proposes a memory augmented architecture for transformers to deal with segmentbased long sequences the idea is simple and easy to follow a special memory token is added to the transformer and corresponding memory operations are introduced to control the storage of the information from previous segments the experiments show the comparison between the proposed method and the transformerxl architecture the reviews are overall positive
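The reviews describe the mechanism as memory tokens prepended and appended to each segment, with the outputs at the write positions carried into the next segment so that gradients can flow across segments. The sketch below is a minimal illustration of that loop; the TransformerEncoder backbone, the layer sizes, and the reuse of the same memory at both the read and write positions are assumptions made for this example, not the authors' implementation.

```python
# Illustrative sketch of segment-level recurrence with memory tokens.
# Assumptions: a vanilla TransformerEncoder backbone and the same memory
# placed at both the read (front) and write (back) positions.
import torch
import torch.nn as nn

class RecurrentMemoryBlock(nn.Module):
    def __init__(self, d_model=256, n_heads=4, n_layers=4, n_mem=4):
        super().__init__()
        self.n_mem = n_mem
        self.mem_init = nn.Parameter(0.02 * torch.randn(n_mem, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, segments, memory=None):
        # segments: list of [batch, seg_len, d_model] tensors, in order
        outputs = []
        for seg in segments:
            if memory is None:
                memory = self.mem_init.expand(seg.size(0), -1, -1)
            # [read memory | segment tokens | write memory]
            x = torch.cat([memory, seg, memory], dim=1)
            h = self.encoder(x)
            outputs.append(h[:, self.n_mem:-self.n_mem])  # per-token outputs
            memory = h[:, -self.n_mem:]  # write outputs become the next read memory
            # memory is not detached, so backpropagation unrolls across segments
        return torch.cat(outputs, dim=1), memory

# usage: split a long [batch, length, d_model] input into fixed-size segments and
# feed them in order; the returned memory can seed the following group of segments.
```

Detaching the memory between groups of segments would recover a truncated-BPTT variant, which relates to the reviewer's concern about instability and out-of-memory issues for large numbers of unrolled segments.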
input_ids: [token id sequence omitted]
attention_mask: [sequence of 1s omitted]
labels: [token id sequence omitted, truncated]
273, 4116, 5122, 3340, 1309, 17032, 1529, 42719, 310, 3629, 253, 3425, 2978, 41329, 407, 4979, 398, 310, 2834, 347, 253, 3054, 4511, 273, 253, 3541, 310, 1199, 4067, 685, 391, 79, 2224, 285, 969, 253, 15180, 2105, 323, 16362, 281, 2045, 3634, 2572, 23352, 347, 253, 3634, 2978, 5459, 50275, 35065, 2987, 452, 9713, 436, 2523, 407, 42324, 253, 39707, 1375, 273, 8993, 2045, 8223, 285, 10242, 272, 247, 747, 1511, 273, 40798, 21496, 326, 760, 24505, 670, 38524, 3064, 390, 509, 13537, 253, 3054, 273, 2045, 3634, 253, 3438, 492, 30291, 556, 644, 2223, 2783, 347, 253, 8245, 323, 4979, 398, 16984, 15969, 3738, 492, 30291, 476, 23240, 3356, 3945, 22349, 323, 3425, 26278, 8892, 627, 403, 16196, 39560, 275, 253, 19265, 1509, 594, 253, 1566, 36908, 8069, 1566, 1048, 3945, 11935, 21011, 671, 253, 39707, 1375, 273, 2045, 8223, 310, 3587, 1677, 347, 253, 3541, 534, 476, 2572, 253, 15180, 2105, 285, 3541, 8284, 323, 253, 3579, 1509, 436, 2987, 3400, 247, 1039, 281, 337, 1509, 253, 27935, 432, 253, 2852, 13288, 534, 310, 18289, 625, 3505, 74, 6216, 1039, 273, 26278, 1048, 3945, 21011, 374, 4684, 12640, 13288, 342, 1355, 18332, 84, 278, 3541, 21761, 835, 278, 310, 1199, 4577, 1979, 685, 253, 8223, 1979, 295, 50275, 74, 1158, 253, 1750, 15969, 342, 3541, 556, 642, 7364, 327, 3576, 3634, 2978, 3542, 275, 253, 11743, 273, 4677, 374, 36908, 3590, 987, 352, 310, 2834, 281, 2572, 253, 3634, 2978, 1293, 11947, 745, 643, 2616, 824, 347, 253, 1979, 273, 1566, 3602, 1955, 281, 3710, 3541, 2317, 273, 253, 1892, 88, 4420, 534, 310, 5393, 275, 253, 2593, 608, 253, 4477, 4159, 326, 391, 6917, 476, 11089, 432, 978, 6720, 285, 258, 297, 3374, 672, 970, 4067, 1180, 273, 270, 431, 85, 440, 1811, 84, 436, 812, 2818, 253, 13642, 2867, 273, 391, 6917, 50275, 1189, 455, 253, 4028, 310, 1175, 352, 11785, 656, 253, 2934, 1077, 4518, 616, 4679, 1007, 21414, 534, 891, 588, 3553, 619, 5701, 2708, 50276, 26122, 670, 253, 4679, 597, 1543, 432, 13506, 8892, 4453, 1175, 4677, 577, 2299, 253, 3045, 273, 253, 391, 6917, 327, 259, 1479, 614, 633, 12172, 2722, 326, 492, 30291, 1335, 17923, 253, 1682, 285, 391, 6917, 476, 5115, 253, 3045, 2074, 281, 492, 30291, 12955, 534, 4648, 2716, 10306, 3541, 891, 1158, 253, 4327, 273, 8223, 2978, 760, 2456, 285, 7783, 310, 3240, 3710, 285, 671, 253, 2978, 273, 3541, 275, 436, 1798, 3368, 285, 7783, 21761, 7835, 3240, 2159, 347, 247, 3425, 2978, 275, 275, 253, 806, 1659, 352, 651, 320, 5322, 281, 823, 4465, 3356, 2978, 824, 347, 7469, 12891, 323, 436, 4836, 347, 973, 2299, 253, 4679, 327, 546, 16123, 25, 4677, 608, 921, 23395, 326, 253, 8245, 39707, 285, 492, 30291, 1097, 2722, 7197, 1543, 347, 253, 3425, 2978, 5459, 533, 391, 6917, 310, 625, 10237, 285, 31221, 697, 3045, 50276, 262, 651, 320, 5322, 604, 368, 476, 3630, 625, 4278, 670, 253, 3210, 5556, 6081, 10669, 1320, 285, 3733, 5199, 841, 4278, 943, 320, 6221, 275, 253, 2022, 7714, 323, 38041, 285, 1239, 1430, 50276, 531, 2181, 326, 891, 651, 751, 281, 12661, 281, 253, 4477, 347, 247, 37317, 281, 1056, 253, 2929, 281, 1007, 625, 10046, 310, 281, 823, 247, 2372, 625, 1666, 25379, 326, 3597, 281, 19477, 253, 2045, 22349, 24088, 509, 8122, 39707, 671, 391, 6917, 285, 8680, 39707, 556, 247, 2810, 2954, 253, 391, 6917, 476, 320, 2326, 347, 2972, 5251, 18902, 2715, 273, 8680, 39707, 581, 812, 9059, 326, 3185, 273, 5277, 581, 3541, 10669, 591, 1016, 3213, 391, 6917, 3400, 278, 3541, 21761, 347, 253, 3634, 323, 5162, 247, 8223, 273, 295, 21761, 5046, 6240, 8680, 39707, 285, 697, 11640, 5277, 278, 3541, 1375, 2074, 281, 391, 6917, 323, 5162, 10341, 1180, 273, 
21761, 275, 253, 747, 8223, 50276, 37585, 4385, 4677, 818, 23378, 275, 253, 2022, 2929, 310, 417, 2908, 275, 253, 2022, 533, 275, 253, 30762, 352, 651, 320, 1805, 281, 1056, 436, 625, 2590, 253, 2929, 42126, 6283, 5469, 670, 253, 7364, 273, 253, 4081, 1332, 2581, 685, 6240, 760, 581, 6197, 275, 2593, 608, 33810, 359, 2540, 978, 6720, 285, 562, 1171, 20704, 3374, 1309, 391, 6917, 3733, 323, 247, 4067, 1180, 273, 270, 431, 85, 440, 1811, 84, 285, 3541, 9552, 891, 1804, 1146, 625, 2590, 670, 436, 285, 849, 352, 476, 2818, 253, 13642, 2867, 273, 253, 4081, 1332, 50276, 783, 4016, 38058, 3486, 369, 417, 5469, 275, 253, 2929, 594, 5549, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 247, 3541, 31612, 10336, 323, 4979, 398, 281, 2968, 342, 253, 8223, 3169, 1048, 6430, 253, 2934, 310, 2969, 285, 3477, 281, 956, 247, 2714, 3541, 10669, 310, 2879, 281, 253, 39707, 285, 3969, 3541, 5871, 403, 5611, 281, 1453, 253, 5718, 273, 253, 1491, 432, 2045, 13288, 253, 4679, 921, 253, 5301, 875, 253, 4081, 1332, 285, 253, 47415, 8056, 30291, 10336, 253, 10123, 403, 4583, 2762 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
Summarize what the paper claims to contribute: the paper develops a spatiotemporal network that is defined in terms of continuous spatial functions and continuous temporal dynamics. The approach is meant to bring deep networks closer to biological neural models. The model is tested on CIFAR-10 and on a variation of CIFAR-10 in which blocks of pixels are blacked out. The methodological novelty consists of combining several existing approaches: continuous kernels, kernel-scale learning, and neural ODEs.

List strong and weak points of the paper.
Strong points:
- I think the motivation is very strong. Deep convolutional networks are increasingly used in brain modelling, but they are somewhat disconnected from earlier computational neuroscience in ways that this paper tries to address.
- The pattern-completion test with blacked-out pixels is a nice way to employ spatiotemporal dynamics in image recognition, and the results are promising.
- It is very interesting that the distribution of learned scales reflects the distribution of receptive-field sizes in primary visual cortex.

Weak points:
- The feature-map evolution results in Fig. 4 are interesting, and a few more examples are given in the appendix, but it would also be nice to see mean/SD dynamics across many images to complement Figure 3c more thoroughly.
- The model doesn't outperform controls on CIFAR-10, although it does perform moderately better with blacked-out blocks of 6x6 pixels or more.
- Performance of the model and baselines on CIFAR-10 is not strong, which raises the question of how compatible the approach is with higher performance.
- The choice of CIFAR-10 as a test of the approach does not seem to be well motivated; a task with a temporal component might give the dynamic parts of the model more to do.

Clearly state your recommendation (accept or reject) with one or two key reasons for this choice: I recommend to accept the paper. There is a growing body of work that compares deep CNNs to biological neural networks, and the conventions of discrete time and space in CNNs unfortunately distance this work somewhat from much previous computational neuroscience. I think the paper helps to close this gap.

Ask questions you would like answered by the authors to help you clarify your understanding of the paper and provide the additional evidence you need to be confident in your assessment:
- Could you please expand on the motivation for continuous space with respect to biological vision? Ultimately vision is based on discrete photoreceptors, and generally on populations of discrete cells. Relatedly, the motivation for continuous time is more obvious, but I think it would also help to comment on spikes in this context.
- The dynamic equation in Eq. 2 seems to be autonomous; how does input affect h?
- In Section 3.2 the learned scales are shown to increase with network depth. This is compared with biological receptive fields, which grow through the visual hierarchy, but I had understood the scale to correspond to the kernel size rather than the receptive-field size, which grows in deep networks even if all the kernels are the same size. Does sigma correspond to kernel or receptive-field size?

Provide additional feedback with the aim to improve the paper:
- Analytic tractability is mentioned on page 1 as an advantage of some continuous models in neuroscience, and it seems to be implied at that point that such benefits are sought in the paper, but it doesn't seem that the approach ultimately offers much hope in this sense. If there is some potential here, please expand.
- I didn't understand "time or network depth" on pg. 4.
- Also on pg. 4, the sampling domain of the filter is given but not the sampling frequency.
- I found Section A1 relatively hard to follow. In particular, Eq. 5 seems to be meant to motivate the filter family, but I didn't follow the argument (or maybe I missed the point entirely). Also, aside from general interest, I didn't understand how the paragraph that contains Eq. 6 relates to the rest of the paper.
- It wouldn't hurt to define "dopri".
- I didn't follow the last paragraph of A2.

Review summary: the authors define continuous deep networks by expressing 2-D convolutional filters as a linear combination of a Gaussian function and its derivatives. By combining this description with the previously proposed neural ODE framework, they obtain a spatiotemporally continuous description of a convolutional neural network. There are three main contributions: (1) they are able to estimate the support width of the filters, and they show that it increases with network depth, as observed in the visual cortex; (2) they show that their network performs as well as alternative non-continuous neural networks on CIFAR-10 while having fewer parameters; (3) they exploit the temporal dynamics to resolve a pattern-completion task. Overall, I think the work is good, but I am not as enthusiastic as the authors about the importance of the work for neuroscience and machine learning.
Strengths:
- The paper is well written and easy to understand; the goals and contributions of the work are clearly stated.
- The quantitative results are marginally good.
- The qualitative results (filter support width, pattern completion, and contrast robustness) are interesting and relevant to neuroscience.
Weaknesses:
- I fail to understand how the work could be relevant to neuroscience beyond what is presented here. What is so important about using spatially continuous filters that couldn't be done with discrete filters?
- The pattern-completion task is not fully conducted; it would be great to reconstruct the missing part of the input.
- The increase of filter support width with network depth correlates with what is known for the visual cortex, but I fail to understand how it could be relevant to current work in experimental neuroscience. What is the benefit of learning continuously changing support for machine learning? About the relation to biology: the increase in size might be more related to the specific task on which the network is trained than to what is observed in the visual cortex.
- The observed contrast robustness is not compared to other neural networks, nor discussed in the light of experimental neuroscience observations.
Minor comments: the text in the figures is way too small; it should be the same size as the main text.

Summary: the authors propose a hierarchical model of neural ODEs which they fit to CIFAR-10. They find performance on par with ResNets and include qualitative analyses of the filters learned by the models, their ability to fill in occluded features, and their robustness to contrast at test time.
Strengths:
- The filter parameterization is interesting; I can imagine this improving sample efficiency in certain contexts. Perhaps the authors should seek out those kinds of tasks to complement their CIFAR results.
- The discussion does a nice job of explaining the issues that neural ODEs have when scaling to large image datasets.
Weaknesses:
- You spend time discussing spatiotemporal receptive fields throughout; why are your models applied to 2-D images? The authors are missing a huge literature on (a) recurrent convolutional networks and (b) using these networks to simulate classical vs. extra-classical receptive-field effects.
- "resnetblocks": is this the original ResNet? Maybe change "blocks" to a citation, or to "v1/v2" depending on which implementation it is; it is unclear from the text.
- There's essentially no difference between the performance of any of the models tested. Is it possible to scale to ImageNet? It is important to show that the proposed method does something different than the standard ResNet. The authors attempted to add some qualitative experiments towards this goal in Fig. 4, but those results are not very convincing; I think to show filling-in you'd want to show reconstruction in RGB space.
### Summary:
This paper received 1 weak accept, 1 accept, and 1 weak reject. All reviewers questioned the motivation for continuous space-time with respect to biological vision: obviously the discrete approximations used in machine vision are approximations, but it is not clear from the paper or the authors' response that this severely limits the ability of deep nets to predict neural data in ways that their continuous nets would not. In addition, I have to confess that I did not really understand the argument made by the authors in their revision. In any case, the burden should be on the authors to go beyond general statements and to really demonstrate that the proposed models provide actual insights for neuroscience. Since the performance in terms of machine vision on CIFAR-10 is underwhelming, the authors have to find a low-data regime, and even then the reviewers stated that the baselines used are not strong baselines; the reduction in the number of parameters is quite small relative to methods for actually reducing the number of parameters. Clearly the work has potential, as noted by the reviewers. The authors suggest that DCNs can be used to model the temporal profiles of neuronal responses, which are known to not be constant even when the experimental stimuli are static images; for example, spatial-frequency responses change over time in response to stationary gratings (Frazor et al., 2004), and similar observations are made for the contrast response function (Albrecht et al., 2002). Such temporal profiles cannot be simulated in conventional CNNs. This sounds like an interesting set of neuroscience data that the authors could indeed be leveraging to demonstrate the benefit of their approach; my recommendation would be to add those in a revision of this paper, which would significantly strengthen the work. I would add that the concepts of temporal and spatial continuity are independent, and the authors should consider studying them separately to provide more in-depth analyses and convincing results. As it stands, the paper has clear potential, but it does not make a sufficient contribution to either ML or neuroscience, and hence I recommend the paper to be rejected.
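The filter family at the center of these reviews is only described in prose above, so the following is a minimal, hypothetical sketch of what a scale-parameterized Gaussian-derivative kernel could look like; the basis order, names, and coefficients are assumptions for illustration, not the reviewed paper's implementation.

```python
import numpy as np

def gaussian_derivative_filter(size, sigma, coeffs):
    """Build a 2-D kernel as a weighted combination of a Gaussian and its
    first-order x/y derivatives on a fixed sampling grid. The learnable
    scale `sigma` is the quantity the reviewer asks about (kernel support
    vs. receptive-field size); the basis and coefficient layout here are
    illustrative assumptions, not the authors' implementation."""
    ax = np.linspace(-(size // 2), size // 2, size)
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    g /= g.sum()                    # zeroth-order basis: normalized Gaussian
    gx = -(xx / sigma ** 2) * g     # first derivative with respect to x
    gy = -(yy / sigma ** 2) * g     # first derivative with respect to y
    basis = np.stack([g, gx, gy])   # shape (3, size, size)
    return np.tensordot(coeffs, basis, axes=1)

# Example: a 9x9 oriented filter whose effective support grows with sigma.
kernel = gaussian_derivative_filter(size=9, sigma=1.5,
                                    coeffs=np.array([0.2, 1.0, -0.5]))
print(kernel.shape)  # (9, 9)
```

In a parameterization like this, `sigma` directly controls the kernel's effective support, which is one way to read the reviewer's question about whether the learned scale measures kernel size or receptive-field size.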
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the current manuscript presented a connectome constrained latent variable model for wholebrain calcium imaging of celegans nervous system the wholebrain calcium imaging of celegans was collected while celegans was undergoing chemosensory testing and the dataset has been published recently in the current study the authors aimed to present a model that could predict the singleneuron and the singletrial activity of this dataset specifically the activity of each neuron of the whole brain imaging was modeled by latent variable analogous to the voltage of one unit in the model network the connection between model network units was constrained by the connectome the authors showed that the connectome constraints significantly improved the prediction power of the model in predicting the activity of missing units as well as predicting the singletrial activity of holdout warms overall the authors have provided a clear description of the model and carried out a good amount of experiments to evaluate different model variants i believe that connectome constraints help improve the model performance in the current setup however the results are not surprising the question is whether the benefit of connectome constraints was due to insufficient training data when you have enough data would an unconstrained model or a loosely constrained model perform as well as a connectome constrained model meaning that a model could learn reasonable connection structure through sufficient training how does the latent space look like the current model predicted the latent variable of neuron voltage and used a differential equation to further predict the observed calcium signal one major question i had was whether such a simplified calciumvoltage model was sufficient for celegans neurons the units of model networks were passive nonspiking units while in real brain calcium signal was mainly generated by suprathreshold neuronal activities although the problem might be partially taken care of by the nonlinearities of the model it is not clear whether that was sufficient on the other hand im wondering whether the oversimplified calciumvoltage model could be one reason causing the relatively worse single neuron prediction performance of sensory neurons considering the spiking properties of sensory neurons they might be more sensitive to inaccurate calciumvoltage models compared to interneurons and motor neurons how does warm prediction look like could authors show some population trajectories could authors discuss what had led to the inaccuracy in whole warm predictions whether due to inaccurate prediction of a particular subset of neurons or whether due to inaccurate prediction of overall brain state in general i vote for accepting the current paper if the authors could address all of my concerns above i think the authors have presented a model for a novel wholebrain calcium imaging which was only recently published overall the modeling strategy makes sense i like that the authors tested on different variants of the model which are all relevant and provided insights on model selection one limitation of the current work is the model generalization i think the application of the model is limited to the current training data im not convinced that the current model could generalize to chemosensory tasks other than the ones in the training sets my major concern is about the clarity of some parts of the paper and some 
additional improvement in the work i hope the authors could address my concerns in the rebuttal docsepthe paper develops a latent variable model with biologically meaningful parameters for the worm c elegans whole brain neural traces given the connectivity between the neurons the neural traces are modeled as stochastic leaky integrators with either conductance or current based synaptic input model given the voltage traces the observed calcium signals are modeled as first order leaky integrators added by noise to account for the observation noise external input is nonlinearly transformed and fed into the neurons to account for the stimulusdependent modulation of neural activity multiple connectivity models fully trainable fully trainable with l2 regularization trainable weight magnitudes trainable global scale are used to investigate whether biological connectome constraints help with neural activity predictions or not variational inference is used to infer a posterior distribution over neural activity trajectories with factorized gaussian distribution as the variational family results on neuronholdout and wormholdout experiments suggest that connectome constraints and conductancebased modeling improves model prediction over unseen neurons and worms the question of whether anatomical constraints help with improving model predictions and constraining the space of generative models of neural activity is an important and significant question in computational neuroscience this paper takes a step towards addressing this question by developing biological models that have interpretable parameters connectivity weights neuron time constants resting membrane potentials yet are flexible to capture the variability across the population or individual neurons by incorporating noise in the latent and observed variables if a model resembles the biological generative process of the data up to appropriate amount of details we would expect that incorporating more biological constraints would help with models predictive performance therefore this approach not only could be used for model selection but also could provide insight about why biological systems prefer certain architectures or mechanisms over others while the paper is clear and interesting there are several issue that need more clarification as listed below hierarchical bayesian modelshttpswwwbiorxivorgcontent101101621540v1articleinfo are already developed to study the dynamics in c elegans how would the results change if the anatomical constraints are imposed on the already established models such as the one mentioned above some comparison with this paper would make the claims stronger in the results why neuronheldout correlation is worse than wormheldout am i misunderstanding this at least intuitively shouldnt we expect to have better performance when a single neuron is missing compared to all the neurons from a worm missing how did the authors perform neuronholdout experiments if a neuron is missing in the training dataset then how do we have a latent variable corresponding to it is the missing neuron treated as missing observations at the test time do we use the reconstruction for evaluating the correlations is the reconstruction the mean of the posterior or is it based on samples for the neuronholdout experiments pairs of neurons 107 are considered why is there 107 pairs and why are we considering pairs of neurons instead of removing a single neuron again i feel like im misunderstanding the neuronholdout experiments is it possible to visualize 
the reconstructed activity of all the neurons for a holdoutworm and training worm in the appendix this would be important for biology experts and can provide further insights into the proposed model how should we ensure that the model is learning time dynamics the factorized gaussian family does not incorporate any dynamical model or smoothness constraints in my understanding is that right when using anatomical constraints for model selection we need to control and account for as many confounding factors as possible that can potentially lead to misinterpretations these factors include but not limited to controlling for the number of parameters across different models more parameters can lead to overfitting can make the optimization harder can make the inference harder since its in a higher dimensional space and successful training what is the convergence criterion in fig 3 the error bars seem too large are the differences significant in the violin plots in addition some neurons have really bad predictive performance can we focus on those neurons to understand why this is happening is it due to lack of measurements form their neighboring neurons is it because they are less predictable and more impacted by unmeasured sources of variability i have some minor comments to and would like the authors to clarify what is theta please write down theta explicitly using the notations introduced in the paper according to wikipedia c elegans has 302 neurons not 300 can you clarify this what is phi are you using a neural network to parametrize phi if so what kind of architecture is used how does the regularization work what is the probabilistic interpretation of adding the m c regularization is this equivalent to defining a normal prior please clarify this why did the authors use l2 regularization isnt l1 regularization associated with sparsity if so what is the probabilistic interpretation of this in the variational inference framework while the approach is very interesting and the problem considered is important the paper lacks comparisons and clearer description and investigation of the results furthermore some justification on specific choices need to be done see above for more detailed comments post discussion im increasing my score to accept since most of my comments are incorporated i thank the authors for their hard work in revising the paper and incorporating additional experimentsvisualizations docsepthe authors propose a biologically constrained latent linear dynamical model of the c elegans nervous system they use connectomic information including chemical vs electrical synapses to constrain connections between units during inference they fit the model to calcium imaging data from whole c elegans using a variational approach they find that biological constraints improve model performance and validity using both withheld neuron and across worm metrics this paper makes some additional developments on the c elegans latent variable modeling literature including both chemical and electrical synapse information adding conductancebased synapses and nonlinear calcium observations with a time constant however i dont believe its the first linear latent variable model to consider connectomic constraints or fitting linear models to c elegans data with variational inference see equation 1 in mena et al 2017 also see linderman et al 2019 for crossworm comparisons and predictions of heldout neurons the clarity of the methods and conclusions could be improved methods does g in equation 5 denote the softplus 
its not clear whether the same in the sentence before refers to g being the same across neurons same as the softplus definition earlier in the section or both is there a reference supporting this observation model is the threshold of g fixed page 4 section 21 redundantly refers to bargmann et al about nonspiking neurons in c elegans twice in the first two paragraphs id suggest moving this reference earlier to the introduction where you first mention that youre using a nonspiking model as many in the iclr audience may not know this fact and itd more quickly support your modeling assumptions last paragraph of page 3 a latent variable model i found this paragraph a little confusing i wasnt sure what many sentences were referring to especially with the introduction of the blackbox networks 2nd sentence vs the blackbox ccn brought up in the third sentence the explanation of variational inference over smc top text on page 3 starting inference instead of smc was a little unclear what the authors wanted to convey with generates a different type of distribution and is more computationally efficient because it allows direct sampling from the approximate posterior a more explicit definition of different would help clarity or removing this altogether and its also unclear whether sampling from the posterior is used to help validateapply the model here conclusions the conductancebased synapses showed a performance improvement in the neuron holdout but not really in the worm holdout condition why is this and does it impact the conclusions how does the calcium observation model impact the model fit this is an important contribution of the proposed model but its not clear if the ca dynamics are necessary compared to for instance a simple lineargaussian model given the observation window size the paper highlights that known biological constraints can be included tractably within latent variable models of biological neural networks in this case the c elegans moreover they provide some evidence that these constraints improve model fits the model goes beyond existing linear latent variable models of c elegans however the conclusions do not include strong measures comparing against existing methodology and its difficult to gauge how much advancement this model provides while the presentation clarity was adequate organization and model presentation could be improved docsepthis paper uses vae variational autoencoder to model neural activities of celegans observed in calcium imaging the main contribution is making the encoderlearned posterior distribution close to a biophysically constrained prior as well as designing the decoder based on biophysical rules strength adding biophysical constraints into neuron activity modeling and conducting both neuron and worm holdout experiments to validate the model alleviate the issue of trailtotrail variability weakness not clearly written at all about the prior pvh generation it only mentions pvh is a biophysically realistic connectomeconstrained network with passive pointneuron dynamics without pointing out what the network is are the only trainable parameters in the network wcji and weji and a mlp layer for encoding hit or are there any other layers and since it has subscript the same as the decoder network should i assume they are the same network if so how can the decoder be trained since it needs the prior to calculate kl divergence for the loss theres no detail about the exact network architecture layer inputoutput dimensions data processing training procedure etc this 
makes the whole logic even harder to comprehend apart from clarity the technical novelty in computer science is a bit insignificant empirically the results are reasonable but not interesting enough and no comparison is made with other baseline modeling methods i would suggest a rewriting rephrasing better connecting between sections putting many appendix contents like a2 and a5 into the main text etc and resubmitting to a neuroscience journal

overall the paper provides a good perspective on modeling neuron activities by using connectome constraints as the vae prior but it is not clearly written and lacks novelty or significance in both technical and empirical aspects

### Summary:
the authors build an encoding model of whole-brain activity by integrating incomplete functional data with anatomical connectomics data this work is significant from a computational neuroscience perspective because it constitutes a proof of concept regarding how whole brain calcium imaging data can be used to constrain the missing parameters of a connectome-constrained biophysically detailed model of the c elegans nervous system there were issues related to clarity in the initial submission which all appeared to have been addressed in the final revision this paper received 3 accepts including one marginal accept and 1 reject the paper was discussed and the reviewers including the negative reviewer were unanimous that the current submission should be accepted
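as a rough illustration of the kind of generative model these reviews describe, here is a minimal sketch of connectome-masked leaky-integrator latent dynamics with a first-order calcium observation model; all names, constants, and the tanh interaction are illustrative assumptions, not the authors' architecture or code

```python
# illustrative toy sketch only (assumed setup, not the paper's model):
# connectome-masked leaky-integrator latent "voltages" driving a first-order
# calcium variable, observed through additive gaussian noise.
import numpy as np

rng = np.random.default_rng(0)
n, T, dt = 50, 2000, 0.01
connectome = rng.random((n, n)) < 0.1            # assumed binary wiring mask
W = rng.standard_normal((n, n)) * connectome     # couplings zeroed off-connectome
tau_v, tau_ca, sigma_v, sigma_obs = 0.1, 0.5, 0.05, 0.02

v = np.zeros(n)                                  # latent nonspiking "voltage"
ca = np.zeros(n)                                 # calcium concentration
obs = np.empty((T, n))
for t in range(T):
    drive = np.tanh(W @ v)                       # graded (nonspiking) interactions
    v = v + dt * (-v / tau_v + drive) + np.sqrt(dt) * sigma_v * rng.standard_normal(n)
    ca = ca + dt * (v - ca) / tau_ca             # first-order calcium dynamics
    obs[t] = ca + sigma_obs * rng.standard_normal(n)   # noisy fluorescence readout

print(obs.shape)                                 # (2000, 50) synthetic traces
```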
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

this paper studies the mixing time of scale invariant models whose parameters are governed by a stochastic differential equation that approximates sgd + weight decay under certain regularity assumptions the paper shows convergence of the parameters to a unique distribution at a rate $\Theta\!\left(\frac{t+\ln(\lambda/\eta)}{\eta\lambda}\right)$

under the setting of scale invariant models whose parameters are governed by an sde approximation of sgd + wd the paper proves the direction of the sgd iterates converges in distribution to a solution of an sde the main technical strategy is to employ 1 a time rescaling 2 a scaling factor that closely mimics the normalization factor for the parameters as in z li 2021 the time rescaling is used to ensure that the main workhorse the katzenberger theorem can be applied essentially by rescaling time by $O(t/\eta^{2})$ the negative gradient of the loss becomes the dominant term in the sde for small learning rate $\eta$ which results in the parameters being close to the manifold $\Gamma$

furthermore the paper shows a cool corollary that scale-invariant models with no weight decay converge exponentially slower

weaknesses the main weaknesses are the strong technical assumptions eg constant noise covariance trace asymptotic results smoothness assumptions but understandable given the difficulty of the problem yes

the main contributions of the paper are 1 providing a characterization of the limiting sde of sgd + wd by using a time-rescaling technique so that the results by katzenberger are applicable 2 showing that sgd without wd for a stochastic scale invariant loss has the same limiting dynamics as sgd + wd but is exponentially slowed down 3 under the assumption of all minimizers forming a compact manifold and the noise being nondegenerate in the tangent space of the manifold the authors show that from any initialization the limiting diffusion process converges to a unique stationary distribution 4 lastly the authors conduct experiments to empirically show that these results hold

strengths the authors are able to show the limiting dynamics for sgd + wd and devise a novel time rescaling scheme to make the results of 1 applicable as they claim that the results of 2 do not hold since there is no true minimizer in the usual sense

weaknesses it seems that the time rescaling approach is the only notable contribution of the paper the derivation of the sde and other results seem to follow from 2 while there seem to be novelties the framework to analyze these dynamics has been provided by 2 so on that ground the contribution seems to be marginal

1 g s katzenberger solutions of a stochastic differential equation forced onto a manifold by a large drift the annals of probability pages 1587–1628 1991
2 z li t wang and s arora what happens after sgd reaches zero loss a mathematical framework arxiv preprint arXiv:2110.06914 2021

mentioned in weaknesses

this paper provided the dynamic convergence analysis of sgd combined with the popular weight-decay technique sgd + wd for scale-invariant loss functions eg neural networks with normalization specifically 1 the authors showed the iterates of sgd converge in distribution to some random variable in $\mathcal{O}\!\left(\frac{1}{\eta\lambda}\ln\!\left(\frac{2\lambda}{\eta}\left(e^{2t}-1\right)+1\right)\right)$ steps where $\eta$ is the stepsize $\lambda$ is the weight-decay parameter and $t$ is the budget of the iterations 2 in the regime of $\eta = \mathcal{O}(1/\lambda)$ when $\eta\lambda \rightarrow 0$ the authors showed that sgd without wd has the same limiting diffusion as sgd
with wd but its convergence is slower 3 under the assumptions on the manifold and the noise covariance the authors partially proved the faster equilibrium conjecture of 26

in general the paper is clear and well written as i know the interplay of normalization gradient noise and weight decay of sgd focused on in this paper is a very important and interesting problem the study of their interplay is beyond the optimization theory the main contribution is that the authors provided the limiting dynamics convergence of sgd with weight decay and gave empirical evidence to verify their results however i am not an expert in the dynamic theory of sde i am afraid that i can not give a fair assessment of the theoretical parts but i checked the numerical results carefully and give my concerns about the experiments on the toy example

in line 262 equation 24 has not been mentioned in the main content the authors may use 10 instead the same problem appears in line 290 in line 267 the authors said that as long as no $z_i$ is in the same direction as $x^{\ast}$ $\cdots$ then assumption 64 holds my question is how to ensure that no $z_i$ is in the same direction as $x^{\ast}$ in general it is difficult to gain knowledge about $x^{\ast}$

the results on cifar10 from figure 1a the values of test accuracy and training accuracy change a lot and the algorithm is unstable i am afraid it is hard to say that the algorithm has converged around 100 epochs i guess because the initial stepsize $\eta = 0.8$ is too large i am curious how do you select the initial step size are you based on the best performance of the algorithm as i know a lot of papers studied that decaying the step size improves the performance i think the interesting result is that the authors claimed that they can use the time scaling in 24 as an upper bound to decide the proper time of lr decay have you used this time scaling 24 to decide when to drop the step size in your experiments i did not see that the selection of 100 epochs is based on their theory

recently there are some studies about the step-decay step size which is very efficient in practice this step size keeps a constant at first and then drops by a constant factor after some iterations for example in 1 the authors analyzed this step-decay step size with the knowledge of the total number of iterations t and they drop the step size after every fixed time period i think the interesting thing is that the authors can use this theory of time scaling 24 to drop the step size adaptively the authors have claimed this point but i am not very sure if the formula of 24 can be used in practice and achieve good performance in deep neural networks i think more experiments are needed if the authors want to verify this claim

1 wang xiaoyu sindri magnusson and mikael johansson on the convergence of step decay stepsize for stochastic optimization advances in neural information processing systems 34 2021 14226–14238

yes they have mentioned their limitations

this paper theoretically proves the fast equilibrium conjecture that sgd for a scale invariant loss mixes in $\mathcal{O}(1/(\lambda\eta))$ steps by considering a new time rescaling limiting sde the limiting sde of sgd without wd mixes much slower than sgd with wd which justifies the benefit of weight decay for generalization performance the paper also provides some empirical results showing that the fast mixing obtained in the asymptotic analysis does appear in practical settings

strengths overall this is an excellent paper that provides important theoretical results in understanding the behavior of sgd in training deep neural
networks more specifically

originality the fast equilibrium conjecture was proposed to explain the interesting generalization performance as mentioned in line 33 as the name suggests this conjecture has not been proved before therefore the proof in this paper is original moreover the new time rescaling sde has never been studied in the existing literature which is totally new to my best knowledge

quality all the results in this paper are clearly stated and justified i briefly checked the proof and it looks sound to me

clarity a well written paper with the main results highlighted

significance the proof of this conjecture provides us a reliable tool to understand sgd which is theoretically significant

weaknesses 1 it would be better to provide some intuitive explanations of the assumptions made to prove the main result can these assumptions be satisfied in the realistic case 2 beyond the proof of the conjecture it would be more interesting to see whether this mixing property can lead to some new conclusions though i agree the content is already enough for one paper 3 section 52 provides some discretization analysis more work should be done on this eg the discretization error 4 please correct typos in the paper eg line 27 affect should be effect

the two major limitations of the current results are 1 the technical assumptions may not be satisfied 2 the asymptotic approximation does not necessarily fully characterize sgd however as i have stated in strengths and weaknesses the current results are already enough for a good neurips paper

### Summary:
this paper continues a line of work studying sgd via a similar sde this time employing a variety of tools not used before for instance time rescaling and normalization layers the reviewers on one hand were concerned at times that this was incremental but also uncovered a variety of interesting contributions as such i feel this paper is a clear accept and am excited to see it in the conference that said the discussions between reviewers and authors were quite detailed and i feel the authors could greatly strengthen their presentation by carefully adjusting the paper in light of them making their intended message more clear for future readers
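as a rough illustration of the dynamics discussed in these reviews (not the paper's actual experiments), here is a minimal numpy sketch of sgd on a toy scale-invariant loss with and without weight decay; because the stochastic gradient of a scale-invariant loss is orthogonal to the parameters, the norm keeps growing without weight decay, which shrinks the effective learning rate of the direction and gives one intuition for the exponentially slower mixing noted above; all constants and the quadratic-on-the-sphere loss are illustrative assumptions

```python
# toy sketch (assumed setup): sgd with/without weight decay on a scale-invariant loss
import numpy as np

rng = np.random.default_rng(0)
d = 10
A = rng.standard_normal((d, d))
A = A @ A.T / d                       # fixed psd matrix defining the loss on the sphere

def stochastic_grad(w, noise=0.1):
    r = np.linalg.norm(w)
    u = w / r
    # L(w) = u^T A u with u = w / ||w||; the exact gradient is orthogonal to w
    g = 2.0 * (A @ u - (u @ A @ u) * u) / r
    return g + noise * rng.standard_normal(d)

def run(eta, lam, steps=20000):
    w = rng.standard_normal(d)
    norms = []
    for _ in range(steps):
        w = w - eta * (stochastic_grad(w) + lam * w)   # weight decay term lam * w
        norms.append(np.linalg.norm(w))
    return norms

for lam in (0.1, 0.0):
    norms = run(eta=0.1, lam=lam)
    print(f"lambda={lam}: ||w|| at steps 1k/10k/20k =",
          [round(norms[i], 2) for i in (999, 9999, 19999)])
# with weight decay the norm equilibrates quickly; without it the norm keeps
# growing, so the effective step size eta / ||w||^2 for the direction decays
```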
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the authors propose globally gated deep linear networks ggdlns which are a variant of gated linear networks glns that is more amenable to theoretical analysis the paper presents a wide breadth of theoretical results which are additionally supported by simulations these include bias variance analysis for singlelayer networks kernel normalization analysis for multilayer networks and multitask learning backpropagating kernel renormalization bkr has been applied for analysing multilayer linear networks one of the contributions of this work is to extend the bkr analysis to multilayer nonlinear networks this nonlinearity is achieved via gating as opposed to static nonlinearities such as relu functions which makes the analysis tractable this transition from linear functions to nonlinear functions is significant theoretical progress the theoretical analysis is sound and insightful most results are intuitiveexpected however i am confused by some of them as i explain in questions 36 i will update my evaluation after understanding some of the results better the pretrained gating idea is novel and shows promise for glnlike architectures perhaps the biggest weakness is that the analysis is limited to the ggdlns and does not directly apply to popular architectures such as relu mlps however analysis of nonlinear networks is extremely difficult so any progress is valuable as the authors note the analysis techniques developed in the paper might be applicable to more popular architectures in the future the paper is written and presented clearly however it can be quite dense in some places which is expected given the quantity of results there are a few minor writing issues the text could benefit from a spellcheck processisng analayzing etc the xaxis label in figure 1 must be p instead of m i believe panel label d is missing in figure 7 see caption d generalization error increases with limitations are adequately addressed docsepthis paper presents theoretical insights that build upon the recently proposed gated linear networks glns first a simplified version of glns called globally gated deep linear networks ggdlns is proposed the motivation for the proposal is to construct a model that can serve as a useful model of learning in general neural networks by being practical enough yet amenable to theoretical analysis the proposed model enables the authors to obtain useful theoretical characterizations of the memory capacity and generalization behavior for ggdlns which are confirmed via simulations i should note that i am not familiar with the prior work in this area and thus my opinions are only based on a few readings of this paper as a newcomer i can not ascertain if all relevant prior work has been properly credited and can not verify that the method of backpropagating kernel renormalization li sompolinsky 2021 has been applied correctly to the proposed model nevertheless the authors make a good case that ggdlns are a more interesting model to study compared to deep linear networks studied in recent work exhibiting generalization behavior that is closer to neural networks used in practice sec 4 it is also shown that there is remarkable qualitative agreement between the papers theoretical predictions regarding generalization and results obtained using gradient descent even though technically the theory applies to posterior predictions obtained using langevin dynamics this again indicates that ggdlns are useful objects to study for predicting the behavior of neural networks in practice yes the limitations are discussed in sec 6 docsepthe authors propose a new architecture based on gated linear networks gln called globally gated deep linear network ggdln they derive equations for the generalization properties and describe the architecture theoretically they also present several experiments comparing ggdln and the same architecture learned by gradient descent the main strength is the attempt to analyze the gated deep linear networks on a theoretical basis moreover the authors have made a considerable effort also to analyse the architecture from different perspectives my main concern is that the work is too dense nine pages look too few to contain all the material the authors want to present making the reading process difficult also because the introductory figures are not as good as they need to be for this reason some details are missing the discussion is very short and some clarifications are missing the authors describe the next perspectives more than the limitations ### Summary:
three reviewers recommend accept the reviewers praised the significant extensions of previous works the novelty of the ideas and found the theoretical analysis sound and insightful the author responses to initial feedback were found to be insightful and reassured the reviewers about their recommendations hence i am recommending accept i encourage the authors to take the reviewers feedback carefully into consideration when preparing the final manuscript and to work on the items promised in the discussion period
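The ggdln reviews above all rest on the same structural point: the gating is a function of the input only, so once the gate pattern is fixed the network is linear in its trainable weights, which is what makes the kernel and renormalization analysis tractable. A minimal sketch of that idea is given below; it assumes a generic gated-linear construction (random halfspace gates shared globally across layers) rather than the paper's exact parameterization, and every name and shape in it is illustrative.

```python
import numpy as np

def ggdln_forward(x, gates, layers):
    """Hypothetical globally gated deep linear forward pass.

    x      : (d,) input vector
    gates  : (k,) input-dependent gate vector, shared by all layers
    layers : list of layers; each layer is a list of k weight matrices
    For any fixed gate pattern the output is linear in every weight matrix,
    which is the property the reviews credit for the tractable analysis.
    """
    h = x
    for layer in layers:
        # gate-weighted sum of linear maps; gates do not depend on weights
        h = sum(g * W @ h for g, W in zip(gates, layer))
    return h

# toy usage with random halfspace gates and two layers of weights
rng = np.random.default_rng(0)
d, width, k = 5, 8, 4
halfspaces = rng.normal(size=(k, d))        # defines the global gating
x = rng.normal(size=d)
gates = (halfspaces @ x > 0).astype(float)  # depends on x, not on weights
layers = [
    [rng.normal(size=(width, d)) / np.sqrt(d) for _ in range(k)],
    [rng.normal(size=(1, width)) / np.sqrt(width) for _ in range(k)],
]
print(ggdln_forward(x, gates, layers))
```

With the gates frozen, the map is a gate-weighted sum of linear chains, so conditioning on the gating pattern reduces the model to a deep linear network; according to the reviews, this is exactly the structure the backpropagating kernel renormalization analysis exploits.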
[ tokenized input_ids, attention_mask, and labels arrays for the example above omitted ]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the paper studies a pure exploration bandit problem in a setting where the reward and policies are assumed to be in rkhs spaces the main result is a passive strategy based on experimental design and bounds on the simple regret the results are extended to the cumulative regret setting via a standard explorethencommit argument in the latter setting the results are depending on the kernel roughly competitive with the stateoftheart more on this later technical correctness i did not check the proofs in detail as far as i see the results are consistent with the literature and i do not have any particular concerns clarity the writing is technically reasonable see the minors for some exceptions the paper does not help the reader very much in developing intuitions you could write for example what the algorithm is doing in the finitedimensional linear setting at the same time i would encourage the authors to spell out where the novelty in their analysis is what technical barrier has been overcome what parts of the proof would be most interesting and most useful in future analysis novelty i am not a super expert on kernel methods for bandits i am a little surprised that the basic results are new i suppose experimental design is already a little bit niche anyway someone more expert than me should probably comment on the novelty other comments 1 it is interesting that an improvement in the rate of the cumulative regret is possible relative to ucb the author could explain a little more why this is possible my own guess is that the nonsequential way the data is collected leads to an improved dependence on the effective dimension which in the kernel setting is coupled to the horizon this effect appears in the finitedimensional setting where the phased algorithms achieve a bound of sqrt(d t log k) with k the number of actions and d the dimension with ucb one obtains only d sqrt(n) note that explorethencommit in this setting yields a suboptimal t^(2/3) rate but the dimension dependence is improved by using the nonsequential estimators an obvious question this raises is whether or not a phased elimination algorithm using your risk bounds can improve even further the algorithm i am proposing would operate in phases m = 1, 2, ... with the number of interactions in phase m being 2^m within each phase the algorithm would use the same procedure as your simple regret algorithm to collect data and then eliminate policies based on confidence intervals does this idea lead to an even better rate if not can you explain why note you will need high probability confidence intervals for this but i guess you could derive them already for your analysis note that some previous work also handles the contextual bandit version of this problem which is not possible using explorethencommit 2 another obvious question is whether or not you can get high probability bounds in theorem 1 i guess the answer is yes 3 there is a related literature on best arm identification in linear bandits here the algorithms are generally adaptive and the bounds depend on the relationship between the policies actions for example 1 below but there are many more recent papers see citations to this paper while these works often study finitedimensional and finiteaction settings they do have a problemdependent nature it would be interesting to investigate this possibility here 1 soare et al bestarm identification in linear bandits 2014 4 my last comment is on assumption 1 how benign is this and how necessary is it in the finitepolicy linear setting the assumption in general simply does not hold what is interesting there is that by solving the logdet problem you can nevertheless find a design that yields the same minimax rate independent of the geometry of the policy set surely we should wonder if the same is true here my intuition says yes since by introducing the effective dimension everything becomes finite in summary can we obtain the same results without this assumption by first introducing the effective dimension and then solving a standard doptimal design problem if not why not if so why not do it minors p2 the regret has not been defined in table 1 p3 pi in argmax but why should this exist some compactness assumption p3 the coefficients mupi and mur were not explainedintroduced p4 policy learning via reward learning policy learning via reward learning p4 there is a missing expectation in the rhs of 7 p5 i am not sure if the singular values sigmaj are defined or i missed where p5 it was not clear to me whether or not the assumption in 9 will be used for the remainder p5 j is introduced abruptly and before it has really been defined p6 where the quantity zetaj for some universal constant c > 0 where the quantity zetaj and c > 0 is a universal constant p6 on what does cpi depend i think it is a universal constant but the order of quantifiers in the assumption is a bit confusing p6 for instance via convexification can you explain this more p7 does not flip the larger eignevectors of more explanation would be helpful here as well p8 we let ncoveps denote an epsnet of i guess really ncoveps is the size of an epsilon net the net itself seems to be c defined in the next sentence or this terminology is unfamiliar to me what you call policies are perhaps more commonly called actions what lower bounds do we have in these settings the observation that rates improve when the policy set is the unit ball was observed in the finitedimensional setting by rusmevichientong et al linearly parameterized bandits 2010 the paper executes about the first thing you would try this can be a strength and a weakness i wish the authors provided more insight in their writing not only explaining what holds but also why and putting it in the context of existing work the paper could be made stronger by investigating any or all of the directions suggested in 1-4 above docsepthe paper addresses the problem of reward learning by learning a reward model from human feedback using the optimal design of the queries the authors address this by essentially framing the problem in the flavor of gpbandits that models rewards and policies as nonparametric functions belonging to subsets of reproducing kernel hilbert spaces rkhss where the learner receives noisy oracle access to a true reward and is expected to output a nearoptimal reward maximizing policy more precisely they analyze the framework of doublynonparametric bandits for theoretically studying the reward learning problem the proposed techniques are shown to yield nonasymptotic excess risk bounds for a simple plugin estimator based on ridge regression the general results can be applied to the specific settings of gpbandits and are shown to yield competitive regret guarantees for the matern kernel the paper is readably well written and the proofs are sound which is however mostly borrowed from the existing analysis of gpbandits eg srinivas 2010 one concern with the problem formulation doublynonparametric bandits is that it seems to be closely tied to the framework of gpbandits which is well studied in the literature it is unclear what the scope and motivation of the proposed framework are beyond what is already covered by gpbandits can you point out a realworld problem that can not be resolved under the gpbandits framework but can be with a doubly nonparametric bandits framework due to the same reason the proposed ridge regressionbased algorithms are also lifted from standard methods used in gp bandits as well as the analysis techniques so it is hard to appreciate the specific novelties of this paper or the unique contribution that is missing in the earlier works this is not reflected in the contribution section of the paper either any comments on the computational complexity of the proposed algorithm given any arbitrary nonparametric reward and policy class finally the experimental evaluation section of the paper is extremely weak there have been no comparisons made with stateoftheart methods not even the algorithms of kernelized mab as listed in table 1 which is surprising the paper has some minor typos eg doublynonparameteric in pg 2 please proofread the draft thoroughly a separate problem formulation and technical contributions section would also be helpful for the readers the theoretical findings of the paper are sound but it is hard to appreciate the novelties of this work over the existing techniques and analysis of gpbandits i will be happy to increase the score if the authors can precisely point out the new challenges overcome by this work and what is the one novel idea unique to this work docsepthis paper is motivated by learning optimal actions in tasks where both the reward function and policy actions are nonparametric previous literature has typically only considered one of these two components as being nonparametric the main focus is on reliably identifying a policy with low instantaneous regretrisk via as few queries as possible where all queries are specified in advance ie the passive query setting the proposed approach selects query locations based on an eigenvalue decomposition of the policy space sampling repeatedly along a set of top eigenvectors to ensure reliable estimation of the reward via a plugin ridgeregressionbased regression this estimated reward function is then minimised over the policy class to return a suggested policyaction the accompanying theory shows the decay of the risk of this suggested action and shows that in the setting of the gaussian process bandit this can enjoy a better rate than existing adaptive approaches such as gpucb this is a solid contribution in an impressively general setting which i believe will be interesting to many working in this area and inspire future work the challenge is well motivated in the introductory sections and the theoretical results show the proposed approach has strong performance i have not been able to fully check all of the mathematical work in the appendices but what i have inspected seems to be accurate and nontrivial while i am assigning a positive score i think the exposition around the improvement over ucbstyle algorithms could be improved presently there is a comment saying that you believe the improvement in terms of the theory is due to the proposed approach being a better approach rather than some gap in the theory this doesnt feel as strong as it could be could you supplement this with some more details as to what features make the difference it is as you identify quite a surprising result that popular adaptive algorithms are theoretically outperformed by passive approaches could some of the elements of this passive approach be employed to produce a yet stronger adaptive approach in future work do you think minor comments typo near the bottom of page 2 gpucb and gpts are only yield sublinear typo in second para of section 3 such general plugin procedure have i am positive about this submission i think it is interesting innovative and potentially very impactful the results are impressive though i think there is an opportunity to give a bit more insight as to the conceptual reasons for the more surprising aspects of the results docsepthis paper essentially studies pure exploration in the nonparametric bandits setting and provides a simple regret guarantee the authors generalized the setting to a nonparametric policy class some new ideas based on minimizing a risk upper bound are proposed overall i feel this is an interesting work and wellwritten section 4 has its value i have some specific comments below 1 i feel the setting is very close to the study of simple regret in the pure exploration problem in the bandit community 2 is essentially the simple regret i feel its a bit weird to call 1 an oracle its just bandit feedback maybe the authors are from other communities but i hope you could relate or comment with bandits language in section 2 pure exploration with a simple regret guarantee is a wellstudied area where you could refer to the bandit algorithm book 2 some important references on gpbandits are missing the minimax optimal regret for the matern kernel of gpbandits has been solved recently the rate is t^((nu + d)/(2nu + d)) the upper bound appears in on information gain and regret bounds in gaussian process bandits aistats 2021 and the lower bound appears in on lower bounds for standard and robust gaussian process bandit optimization icml 2021 the regret bound in this paper is clearly suboptimal when reducing to gpbandits please correct me if i am wrong if this is correct i think it is very important to discuss how the current result can go beyond the gpbandits setting and how important the nonparametric policy is and how the current regret bound explains the hardness of a more general policy class 3 i feel one should be cautious to argue passive learning is better than adaptive learning in terms of simple regret the bound you compare with is not sharp actually in a recent work bandit phase retrieval httpsarxivorgabs210601660 the authors have shown that adaptive learning is strictly better than passive learning the rate is sharp there their model is strictly a subclass of your model i think 4 in section 5 do you require the space of input points to be the full unit ball because when you convert your algorithm into an online regret minimization algorithm the explorethencommit should not be a good one unless your action set has some good curvature to use like the full unit ball 5 how it relates to optimal design indeed in the main section the word optimal design does not even appear even though it appears in the title i feel you should explain what you mean by optimal design explicitly this paper presents some interesting ideas and generalizes the setting to a nonparametric policy class i hope the authors could clarify their contribution wrt the sota rate ### Summary:
while the reviewers found several interesting points about the paper they raised several issues which prevent me from recommending acceptance of the paper in particular the paper is not positioned properly in the literature hence the novelty and the contributions are not properly clarified the approach of the paper is reasonably simple which would be a good thing by itself but there seem to be natural avenues along which more complete results could be obtained as mentioned in the reviews finally the experiments should be improved eg comparing with other algorithms from the literature in summary this is a promising work but it requires some improvements before it can be published
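The first review above spells out a concrete alternative to explore-then-commit: run in phases of roughly 2^m pulls, spread each phase's pulls according to an experimental design over the surviving policies, and eliminate with confidence intervals; the comment on assumption 1 likewise points at the standard log-det (D-optimal) design. A rough illustration of that combination in the plain finite-arm linear setting is sketched below. It is not the paper's algorithm: the confidence-width constants are illustrative and the names d_optimal_design, phased_elimination, and pull are made up for the example.

```python
import numpy as np

def d_optimal_design(X, iters=300):
    """Frank-Wolfe style maximization of log det(sum_a pi_a x_a x_a^T)
    over design weights pi (the log-det problem, Fedorov/Wynn updates)."""
    n, d = X.shape
    pi = np.full(n, 1.0 / n)
    for _ in range(iters):
        A = X.T @ (pi[:, None] * X) + 1e-9 * np.eye(d)
        lev = np.einsum("ij,jk,ik->i", X, np.linalg.inv(A), X)  # x_a^T A^{-1} x_a
        a = int(np.argmax(lev))
        if lev[a] <= d + 1e-6:          # at the optimum, max leverage equals d
            break
        step = (lev[a] / d - 1.0) / (lev[a] - 1.0)
        pi = (1.0 - step) * pi
        pi[a] += step
    return pi

def phased_elimination(X, pull, horizon, delta=0.05):
    """Reviewer-style phased scheme: phase m spends about 2^m pulls spread by a
    D-optimal design over surviving arms, re-estimates the reward by least
    squares, then drops arms whose estimate falls well below the best one."""
    n, d = X.shape
    active, spent, m, best = np.arange(n), 0, 1, 0
    while spent < horizon and len(active) > 1:
        budget = min(2 ** m, horizon - spent)
        pi = d_optimal_design(X[active])
        counts = np.maximum((pi * budget).round().astype(int), 1)
        A, b = 1e-6 * np.eye(d), np.zeros(d)
        for arm, c in zip(active, counts):
            for _ in range(c):
                A += np.outer(X[arm], X[arm])
                b += pull(arm) * X[arm]
        spent += int(counts.sum())
        theta = np.linalg.solve(A, b)
        mu = X[active] @ theta
        width = np.sqrt(2.0 * d * np.log(2.0 * n * m * (m + 1) / delta) / budget)
        best = active[int(np.argmax(mu))]
        active = active[mu >= mu.max() - 2.0 * width]
        m += 1
    return best

# toy usage with a linear reward and gaussian noise
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 4))
theta_star = rng.normal(size=4)
pull = lambda arm: X[arm] @ theta_star + 0.1 * rng.normal()
print(phased_elimination(X, pull, horizon=5000))
```

In the finite-dimensional case the Kiefer-Wolfowitz equivalence caps the worst-case leverage x^T A(pi)^{-1} x at d for the optimal design, which is what makes the confidence width above independent of the geometry of the arm set; whether the same trick removes assumption 1 in the rkhs setting is precisely the question raised in the first review.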
[ 1895, 368, 476, 17837, 1089, 247, 2216, 326, 11026, 253, 1072, 7221, 991, 2281, 3907, 273, 253, 12087, 273, 253, 3646, 873, 13353, 359, 943, 4282, 604, 253, 1072, 310, 2032, 1060, 619, 30328, 2296, 4754, 1580, 407, 16984, 253, 3576, 7877, 3253, 4916, 6486, 275, 6010, 476, 359, 4044, 253, 1072, 1543, 1293, 436, 9376, 407, 806, 16984, 253, 3576, 7877, 285, 840, 16161, 247, 2629, 277, 29776, 2216, 1895, 604, 417, 2139, 417, 604, 594, 2139, 417, 513, 352, 50272, 1222, 641, 50275, 81, 19, 253, 14938, 556, 417, 644, 2931, 275, 2829, 337, 50275, 81, 20, 12580, 275, 1736, 4090, 50276, 2858, 2139, 943, 436, 2226, 690, 8566, 1255, 9376, 50275, 81, 20, 253, 10303, 278, 484, 74, 285, 4682, 497, 417, 5544, 36445, 758, 50275, 81, 21, 3646, 4715, 3066, 10921, 4715, 50276, 22872, 4715, 3066, 10921, 4715, 50275, 81, 21, 627, 310, 247, 5816, 15355, 275, 253, 38309, 273, 818, 50275, 81, 22, 891, 717, 417, 2119, 604, 253, 11098, 2193, 40009, 75, 403, 2931, 390, 891, 9829, 835, 50275, 81, 22, 352, 369, 417, 2590, 281, 479, 1880, 390, 417, 253, 9376, 275, 898, 588, 320, 908, 323, 253, 6414, 50275, 81, 22, 480, 310, 5611, 30046, 285, 1078, 352, 556, 1663, 644, 2931, 50275, 81, 23, 835, 253, 10671, 1182, 292, 1432, 50275, 1542, 690, 10898, 3638, 260, 50276, 17, 50276, 2811, 253, 10671, 1182, 292, 1432, 50275, 395, 260, 50276, 17, 310, 247, 10898, 3638, 50275, 81, 23, 327, 752, 1057, 260, 2059, 3469, 891, 1158, 352, 310, 247, 10898, 3638, 533, 253, 1340, 273, 2677, 13783, 275, 253, 9376, 310, 247, 2372, 21643, 50275, 81, 23, 323, 4227, 3066, 17133, 1877, 50276, 5092, 368, 5513, 436, 625, 50275, 81, 24, 1057, 417, 19153, 253, 4067, 299, 525, 20010, 5285, 273, 50276, 3062, 8813, 651, 320, 9371, 1060, 347, 973, 50275, 81, 25, 359, 1339, 295, 68, 710, 793, 9173, 271, 299, 793, 3024, 273, 891, 5476, 1663, 295, 68, 710, 793, 310, 253, 1979, 273, 271, 299, 4277, 2036, 253, 2036, 3139, 3133, 281, 320, 260, 2931, 275, 253, 1735, 6197, 390, 436, 28939, 310, 32139, 281, 479, 50275, 5371, 368, 1067, 7823, 403, 4931, 625, 7744, 1925, 5231, 50275, 5371, 2406, 14493, 513, 359, 452, 275, 841, 7533, 50275, 783, 8310, 326, 4142, 3157, 672, 253, 3646, 873, 310, 253, 3943, 4023, 369, 2540, 275, 253, 1442, 959, 37613, 4758, 407, 20035, 358, 1173, 469, 850, 543, 1162, 355, 23352, 4764, 1025, 3961, 953, 4267, 50276, 783, 2929, 42506, 670, 253, 806, 2181, 368, 651, 1611, 436, 476, 320, 247, 4757, 285, 247, 14855, 891, 5730, 253, 4477, 2530, 625, 12288, 275, 616, 4028, 417, 760, 15571, 752, 6556, 533, 671, 2139, 285, 8133, 275, 253, 3634, 273, 5368, 789, 253, 2929, 812, 320, 1160, 10046, 407, 15686, 667, 390, 512, 273, 253, 10746, 5125, 275, 1638, 1840, 5474, 339, 431, 248, 2929, 12453, 253, 1895, 273, 10921, 4715, 407, 4715, 247, 10921, 1566, 432, 1966, 8680, 970, 253, 8654, 2216, 273, 253, 19241, 4477, 2953, 436, 407, 9093, 39926, 253, 1895, 275, 253, 13746, 273, 31025, 4152, 953, 326, 3210, 23267, 285, 7823, 347, 1327, 36928, 3470, 15823, 281, 20077, 273, 39306, 10295, 288, 300, 6291, 8470, 391, 17616, 859, 835, 253, 458, 47612, 14488, 27620, 42295, 2289, 281, 247, 2032, 10921, 285, 310, 3264, 281, 3453, 247, 2822, 29776, 10921, 46875, 3646, 625, 10534, 597, 12106, 253, 7792, 273, 44881, 4160, 36928, 3961, 953, 323, 28055, 12392, 253, 10921, 4715, 1895, 50276, 783, 4081, 5609, 403, 2011, 281, 4917, 1327, 284, 40045, 3875, 6714, 2495, 14493, 323, 247, 2969, 15191, 29107, 1754, 327, 27563, 9077, 253, 2087, 1543, 476, 320, 3732, 281, 253, 2173, 7533, 273, 31025, 4152, 953, 285, 403, 2011, 281, 4917, 12085, 14938, 23632, 323, 253, 45171, 
10295, 50276, 783, 2929, 310, 1239, 1598, 973, 3542, 285, 253, 27947, 403, 3590, 534, 310, 2299, 6571, 29563, 432, 253, 5368, 1783, 273, 31025, 4152, 953, 24088, 256, 11078, 34627, 4267, 50275, 531, 4468, 342, 253, 1895, 15895, 44881, 4160, 36928, 3961, 953, 310, 352, 3133, 281, 320, 8244, 12331, 281, 253, 7792, 273, 31025, 4152, 953, 534, 310, 973, 5421, 275, 253, 6239, 352, 310, 12744, 253, 660, 11192, 285, 42852, 273, 253, 4081, 31225, 4457, 326, 403, 2168, 6107, 407, 31025, 4152, 953, 476, 368, 1127, 562, 247, 1524, 10186, 1895, 326, 476, 417, 320, 11512, 762, 253, 31025, 4152, 953, 7792, 533, 342, 247, 44881, 1327, 36928, 3961, 953, 7792, 1955, 281, 253, 1072, 1921, 253, 4081, 27563, 9077, 3169, 11333, 403, 671, 14287, 432, 2629, 3082, 908, 275, 31025, 3961, 953, 347, 973, 347, 253, 1783, 5609, 594, 352, 310, 1892, 281, 11435, 253, 2173, 4460, 2890, 273, 436, 2929, 390, 253, 4451, 7680, 326, 310, 5816, 253, 4321, 2987, 436, 1057, 417, 4887, 432, 253, 7680, 2593, 273, 253, 2929, 347, 973, 50276, 1279, 5701, 327, 253, 15180, 10454, 273, 253, 4081, 5933, 1677, 667, 10341, 1327, 36928, 10921, 285, 3646, 966, 4720, 253, 5661, 7103, 2593, 273, 253, 2929, 310, 6685, 5075, 627, 452, 644, 642, 14023, 1160, 342, 1375, 23037, 14387, 3082, 1014, 253, 11333, 273, 10295, 1025, 78, 357, 347, 7117, 275, 2829, 337, 534, 310, 10084, 50276, 783, 2929, 556, 690, 5884, 963, 993, 24088, 44881, 4160, 19484, 280, 275, 23256, 374, 4496, 4737, 1088, 253, 7482, 16575, 247, 4858, 1895, 15895, 285, 7681, 9021, 2593, 651, 671, 320, 9371, 323, 253, 10668, 50275, 783, 10527, 4342, 273, 253, 2929, 403, 3590, 533, 352, 310, 1892, 281, 11435, 253, 4460, 2890, 273, 436, 789, 689, 253, 5368, 5609, 285, 1783, 273, 31025, 4152, 953, 891, 588, 320, 5211, 281, 2572, 253, 4868, 604, 4477, 476, 10534, 1127, 253, 747, 7881, 11399, 407, 436, 789, 285, 752, 253, 581, 4460, 2934, 4451, 281, 436, 789, 50276, 7152, 33032, 2520, 2929, 310, 17194, 407, 4715, 8654, 5231, 275, 8892, 835, 1097, 253, 10921, 1159, 285, 3646, 5231, 403, 1327, 36928, 2045, 6239, 556, 5431, 760, 2783, 581, 273, 841, 767, 4295, 347, 1146, 1327, 36928, 253, 2022, 2770, 310, 327, 27340, 12488, 247, 3646, 342, 1698, 35774, 810, 250, 1206, 1886, 3066, 347, 1643, 19241, 347, 1896, 835, 512, 19241, 403, 7616, 275, 7170, 26332, 253, 16864, 7316, 4758, 253, 4081, 2746, 34899, 7316, 8593, 1754, 327, 271, 25023, 14717, 273, 253, 3646, 2317, 10491, 12889, 2112, 247, 873, 273, 1755, 48670, 281, 5416, 9630, 13418, 273, 253, 10921, 3066, 247, 15191, 27563, 1747, 1256, 3169, 9077, 436, 5998, 10921, 1159, 310, 840, 7221, 1701, 689, 253, 3646, 966, 281, 1091, 247, 5125, 3646, 1913, 253, 17909, 3762, 2722, 253, 10027, 273, 253, 2495, 273, 436, 5125, 2250, 285, 2722, 326, 275, 253, 4758, 273, 253, 305, 12064, 1232, 3961, 262, 436, 476, 4264, 247, 1805, 2281, 685, 5368, 17825, 7274, 824, 347, 31025, 1028, 67, 50276, 2520, 310, 247, 4891, 7680, 275, 271, 21097, 1242, 2087, 4758, 534, 891, 2868, 320, 4722, 281, 1142, 2444, 275, 436, 2170, 285, 26761, 2852, 789, 253, 5691, 310, 973, 17194, 275, 253, 47649, 7118, 285, 253, 10527, 1543, 921, 253, 4081, 2746, 556, 2266, 3045, 891, 452, 417, 644, 2104, 281, 4751, 2451, 512, 273, 253, 15965, 789, 275, 253, 14801, 1271, 533, 752, 891, 452, 36560, 3133, 281, 320, 7899, 285, 37825, 1223, 891, 717, 34018, 247, 2762, 4868, 891, 1158, 253, 47284, 1475, 253, 7756, 689, 44274, 67, 4826, 11333, 812, 320, 5520, 50275, 10192, 1574, 627, 310, 247, 4385, 3981, 326, 368, 2868, 253, 7756, 275, 2426, 273, 253, 3762, 310, 1955, 281, 253, 4081, 2746, 
1146, 247, 1805, 2746, 2581, 685, 690, 8037, 275, 253, 3762, 436, 36908, 1928, 347, 2266, 347, 352, 812, 320, 50276, 16534, 368, 8499, 436, 342, 690, 625, 4278, 347, 281, 752, 3386, 1056, 253, 3064, 352, 310, 347, 368, 4271, 3240, 247, 10084, 906, 326, 4633, 17825, 11333, 403, 28055, 41731, 10574, 407, 16864, 7274, 812, 690, 273, 253, 3603, 273, 436, 16864, 2746, 320, 7091, 281, 4711, 247, 2568, 10046, 17825, 2746, 275, 2852, 789, 513, 368, 1158, 50276, 37585, 5701, 209, 186, 555, 5367, 2822, 253, 5004, 273, 3239, 374, 31025, 1028, 67, 285, 305, 45276, 403, 760, 4917, 749, 8172, 209, 186, 555, 5367, 275, 1273, 5586, 273, 2593, 495, 824, 2087, 15191, 5199, 452, 50276, 74, 717, 2762, 670, 436, 19529, 891, 1158, 352, 310, 4722, 16694, 285, 7826, 1077, 3486, 1020, 253, 1543, 403, 13943, 2167, 891, 1158, 627, 310, 271, 5107, 281, 1918, 247, 2372, 625, 12288, 347, 281, 253, 20178, 4606, 323, 253, 625, 10084, 7794, 273, 253, 1543, 5474, 33032, 2520, 2929, 9093, 2175, 6313, 17947, 275, 1327, 36928, 3961, 953, 4758, 285, 3400, 247, 2969, 14938, 12215, 253, 4477, 14923, 253, 4758, 281, 1327, 36928, 3646, 966, 690, 747, 5697, 1754, 327, 28699, 247, 2495, 5170, 3033, 403, 4081, 50276, 1189, 455, 891, 1928, 436, 310, 271, 4722, 789, 285, 973, 15720, 2593, 577, 556, 697, 1318, 891, 452, 690, 2173, 5701, 2708, 50276, 18, 891, 1928, 253, 4758, 310, 1077, 2810, 281, 253, 1263, 273, 2969, 14938, 275, 253, 6313, 17947, 1895, 275, 253, 3961, 262, 3114, 374, 310, 9093, 253, 2969, 14938, 891, 1928, 697, 247, 2372, 12504, 281, 1067, 337, 271, 42295, 697, 816, 3961, 262, 8680, 5046, 253, 4477, 403, 432, 643, 7888, 533, 891, 3524, 368, 812, 14588, 390, 4385, 342, 3961, 953, 3448, 275, 2593, 374, 6313, 17947, 342, 247, 2969, 14938, 12215, 310, 247, 973, 14091, 728, 2170, 835, 368, 812, 3730, 3961, 262, 5933, 1984, 50275, 19, 690, 1774, 10414, 327, 31025, 4152, 953, 403, 5816, 253, 7221, 991, 8654, 14938, 323, 253, 45171, 10295, 273, 31025, 4152, 953, 556, 644, 14042, 4102, 253, 2281, 310, 246, 79, 438, 19, 79, 438, 253, 5170, 3033, 4620, 275, 327, 1491, 6351, 285, 14938, 14493, 275, 305, 12064, 1232, 3961, 953, 247, 382, 1832, 43425, 285, 253, 2406, 3033, 4620, 275, 327, 2406, 14493, 323, 2629, 285, 10237, 305, 12064, 1232, 3961, 262, 13757, 17857, 1686, 43425, 253, 14938, 3033, 275, 436, 2929, 310, 4518, 749, 29776, 672, 8493, 281, 31025, 4152, 953, 4496, 3451, 479, 604, 891, 717, 3430, 604, 436, 310, 3451, 891, 1158, 352, 310, 1077, 1774, 281, 2319, 849, 253, 1655, 906, 476, 564, 4457, 253, 31025, 4152, 953, 4758, 285, 849, 1774, 253, 1327, 36928, 3646, 310, 285, 849, 253, 1655, 14938, 3033, 11424, 253, 38576, 273, 625, 2087, 3646, 966, 50275, 20, 891, 1928, 352, 943, 320, 31798, 281, 9059, 16864, 4715, 310, 1805, 685, 17825, 4715, 275, 2426, 273, 2969, 14938, 253, 3033, 368, 7277, 342, 310, 417, 9479, 2686, 275, 247, 3332, 789, 3961, 262, 3408, 25064, 5987, 39962, 2061, 5375, 19, 12971, 11718, 1549, 253, 4477, 452, 2011, 326, 17825, 4715, 310, 13714, 1805, 685, 16864, 4715, 253, 2281, 310, 9479, 627, 616, 1566, 310, 247, 13714, 35851, 273, 634, 1566, 891, 1158, 50275, 21, 275, 2593, 608, 513, 368, 2430, 253, 2317, 273, 3280, 2792, 281, 320, 253, 2120, 3943, 4023, 984, 672, 368, 6455, 634, 5933, 715, 271, 3909, 14938, 41458, 5933, 253, 8338, 7461, 25756, 943, 417, 320, 247, 1175, 581, 5734, 634, 2250, 873, 556, 690, 1175, 16841, 281, 897, 751, 253, 2120, 3943, 4023, 50275, 22, 849, 352, 7033, 281, 8654, 2216, 6296, 275, 253, 2022, 2593, 253, 3159, 8654, 2216, 1057, 417, 1014, 3176, 604, 352, 4620, 275, 253, 4060, 
891, 1928, 368, 943, 5513, 752, 513, 368, 1599, 407, 8654, 2216, 11120, 50276, 2520, 2929, 10262, 690, 4722, 5697, 285, 2087, 4219, 253, 4758, 281, 1327, 36928, 3646, 966, 891, 3524, 253, 4477, 812, 19148, 616, 7680, 8772, 253, 256, 5503, 2281, 50276, 187, 187, 4118, 18435, 27, 6050, 253, 30628, 1119, 2067, 4722, 2792, 670, 253, 2929, 597, 5439, 2067, 3374, 534, 16897, 479, 432, 46705, 14924, 273, 253, 2929, 275, 1798, 253, 2929, 310, 417, 15471, 6283, 275, 253, 6239, 7613, 253, 38135, 285, 253, 9021, 403, 417, 6283, 31637, 253, 2746, 273, 253, 2929, 310, 12054, 2969, 534, 651, 320, 247, 1175, 2181, 407, 3139, 533, 627, 1646, 281, 320, 3626, 44201, 2112, 534, 625, 3426, 1543, 812, 320, 2797, 347, 5393, 275, 253, 10123, 4720, 253, 4679, 943, 320, 5520, 24088, 10941, 342, 643, 11333, 432, 253, 6239, 275, 6010, 436, 310, 247, 12532, 789, 533, 352, 4419, 690, 11701, 1078, 352, 476, 320, 3863 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ labels: integer token ids of the tokenized review text — full listing omitted ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper studies the prediction problem under spurious association the idea is to construct a nuisance randomized distribution based on the observed distribution the nuisance randomized distribution is constructed by reweighting the observed data then the samples from the nuisance randomized distribution are used to learn a predictive function in general this paper studies a longstanding problem of spurious association in fitting a predictive function the intuition of nurd is clear if we have data from a distribution where the nuisance variables z are randomized and do not cause the label y the prediction will not suffer from the spurious association another strength of this paper is that it studies a variety of representative data sets the following are my major questions/comments about this paper 1 a major concern is about the setting of the data in the literature of this problem eg invariant causal prediction peters 2016 invariant risk minimization arjovsky 2019 the input is o = (x, z) and we do not know which are x and which are z for example with an image of a cow on the grass we do not know which pixels are of the cow x and which pixels are of the grass z but this paper assumes we know this separation a priori then why not just exclude all nuisance z from the predictor and only use x as the input 2 the paper lacks sufficient clarity eg sec 2 a main reason might be that the paper focuses on the problem of spurious association which is a causal problem but it does not use causal language and notation eg docalculus potential outcome 3 the simulation results might need further checking in the color mnist example nurd reaches the oracle accuracy 75 however this is an ideal upper bound of accuracy referring to irm arjovsky 2019 the prediction accuracy on the greyscale images without any spurious association is 73 the author may need to check/explain why nurd outperforms this practical oracle 4 nurd does not compare with any methods that improve over classic erm 5 some implementation details are not presented such as how to separate x and z and how to choose the hyperparameter lambda in sum the reviewer thinks the setting of this paper the writing clarity and the simulation study may need further revision and improvement before getting published docsepthis paper introduces a method to reduce the influence of nuisance variables in predictive tasks this is done by fitting a nuisancerandomized distribution to the data and finding a data representation under this distribution that makes the nuisance and the label independent experiments show that by using a classifier learned on this representation the method is able to improve classification performance by limiting the impact of nuisance variables strengths nuisanceinduced spurious correlations are a major issue in many applications since they lead to models that cannot generalize well to unseen data this paper introduces a novel idea to address this issue to the best of my knowledge which is theoretically principled and well motivated as such i found the paper a very interesting read the proposed method allows removing many of the strong assumptions that limit the application of competing ones for example it can be used in high dimensional image classification tasks the experimental results show that despite highly different training and test distributions nurd is able to greatly reduce the influence of nuisance variables main weaknesses while the
presented theory and results are very compelling there are two main issues with the exposition that would greatly limit its impact in the iclr community 1 the theoretical exposition is unnecessarily abstract and complex 1 sections 2 and 3 are much heavier from the theoretical point of view than the average iclr paper given the nature of the presented method this is of course not a problem per se however to have an impact on the community these sections should be made much more accessible what i find lacking the most is intuition on what is being presented one way to convey it is to use throughout these sections a running example eg the very clear cows vs penguin example of the introduction most of the propositions theorems and lemmas would be way easier to understand if the authors provided intuition of what they mean if we consider x being the input image and z being its background in the running example 2 this is particularly important when presenting nurd in section 3 the authors should give way more intuition to the reader using an actual example on what the randomization and distillation steps do for example what is the practical difference of using 6 vs 7 in the cows vs penguin classification task 3 the introduction in page 2 is very hard to understand without more concrete examples for being the introduction of the paper the discussion is way too abstract for readers unfamiliar with the field 2 from the main text there is close to no information on how this method would be implemented in practice 1 after reading the main text i had an understanding of the theory but i had no clue on what i would need to do to implement this as well as the level of additional complexity this method introduces if i were to apply it in a standard neural network classification task much of the info in the appendix should be moved or at least better addressed in the main text one possible way could be presenting how the reweighting/generative nurd and critic model would look in a standard neural network classification task and what optimizing 6 and 7 imply 2 without the code being available i doubt many people would know how to implement this method some of the above issues might be due to the lack of space in the main text some ideas to gain some extra space to implement the above changes could be moving lemma 1 theorem 1 and the discussion around them to the appendix the key component for nurd is theorem 2 lemma 1 and theorem 1 can be just seen as necessary steps to prove theorem 2 this would also allow simplifying the theoretical part of the paper mostly focusing in the main text on reweighting nurd which seems more broadly applicable see below most of the generative nurd discussion could be covered in the appendix moving parts of the related work to the appendix comments/questions on experiments 1 for the more realistic experiments in 53 and 54 generative nurd seems to struggle is generative nurd feasible in practice for image classification tasks generating good high dimensional images remains a challenge what if the estimate of the nuisance randomized distribution is poor 2 in real world applications assuming that the objects to classify in the images are centered might not be realistic what happens if in many images from the dataset the background z contains useful info for the classification task how robust is nurd to this this could happen for example with pathologies that are in the outer region of the lungs in chest xray images 3 how well does nurd scale wrt standard classification what are
in the experiments the training times of erm generative nurd and reweighting nurd 4 section 51 is missing a reference to table 1 this paper presents a very interesting idea with competitive performance on a very relevant problem in real world applications the exposition of the paper however needs to be greatly simplified by giving a lot more intuition and implementation details due to this in its current state i am afraid that the paper would not have the impact it deserves in the iclr community updated review after the authors rebuttal and revision of the paper i have read the paper once more the authors did a great job improving the exposition of the paper which is now way easier to understand and accessible to a wider audience i have therefore increased my score to an accept docsepthe paper presents a formalization of spuriously correlated distractor features into a family of distributions one goal here is to train a model to be robust against these distribution shifts the paper presents results comparing against erm and argues that it is unique in its problem formulation compared to other methods that make assumptions about environments or specific spurious correlations i find the mathematical formalization to be a bit too much and it makes it harder to understand the core proposed ideas eq 1 is a contribution of this work and it presents the problem of spurious correlations broken down into the contributions of a nuisance variable z conditioned on the label and the input features conditioned on both the label and z from what i understand of the nurd method you estimate the nuisance variable using a neural network and then use this to reweight the training samples how exactly do you learn ptr(z) and ptr(y | z) as this is a core contribution of this paper please explain this very clearly i find the evaluations lacking only one type of spuriously correlated feature was used why not evaluate on the celeba evaluation in sagawa et al 2019 and for chest xrays why not use the same experimental setup as zech et al 2018 instead of correlating the border note a version of the zech experiment was done using all publicly available data in https://arxiv.org/abs/1910.00199 also in the nonmnist experiments only a single border thickness was used in order to explore the utility of this approach please run experiments with multiple types of spurious correlations possibly also varying the amount of correlation please show an experiment where this method doesnt work so that we may understand its limitations the proposed formalization and method are interesting however the presentation is not focused on the core contribution the method is lacking detail and the experiments do not characterize the method well docsepthis paper proposes nuisancerandomized distillation nurd to deal with spurious correlations induced by a changing relationship between labels and nuisance variables where covariates are also correlated with nuisance the paper first introduces the nuisancerandomized distribution under which labels and nuisance are independent and shows that nurd can find uncorrelating representations of covariates where labels remain independent from nuisance when conditioned on these covariate representations the paper further describes how to find the optimal representation that has maximum information with labels nurd is evaluated on various classification tasks involving classconditional gaussians colored mnist images waterbirds and chest xrays strength 1 a great list of examples is provided that
describe how spurious correlations arise 2 mathematical details and proofs are provided in full detail for all theorems etc 3 nurd works extremely well on problems where the nuisance can be clearly defined and its relationship with the label is strong and changing a lot between train and test data weaknesses 1 too many technical details in the introduction without adequate context eg too early for we show that the distribution of the label given the covariates under the nuisancerandomized distribution is minimax optimal for the nuisancevarying family if the label and the nuisance are independent given the covariates and all the text after hard to get intuition when the second half of the intro is just listing math properties without some high level descriptions of how to attain those properties 2 same for the methods section which repeats much of the introduction with some math interleaved would be nice to have a concrete example to illustrate all steps involved in nurd and then go into the theorems definitions etc as currently written it is hard to get an integrated picture when section 2 is just a bunch of math definitions 3 perhaps consider putting related work up front to provide context and state contributions with respect to existing works any reasons why domain adaptation and optimal transport are not mentioned 4 would the choice of nuisance distribution have a major effect on the results how sensitive are results to the many parameter choices 5 the experiments with gaussian color mnist and waterbirds seem quite extreme in terms of the changing relationship between nuisance and labels would we see the same results for less extreme scenarios a huge problem in medicine is using genomic or imaging data to predict neurodegenerative disease status with age being the nuisance would be great if nurd works on this problem 6 nurd does not perform as well as erm when the nuisancelabel relationships do not change mainly because nurd does not use the nuisance variables to its advantage is it possible to create experiments where there are no nuisance variables that can be used would like to see if nurd performs at least on par with erm in that case ie not underperforming when there is no change in nuisancelabel relationships 7 for the real world problem examined in this paper pneumonia cases are upsampled thus might not provide a realistic classification assessment ie samples would be correlated would like to see results with controls downsampled 8 quantitative comparison with existing dl methods eg if we have z we can find a representation of x that predicts y well but not z overall the paper is very rigorous in proving the mathematical guarantees of the proposed method but the method itself is not performing that well for harder real world problems my evaluation is based on the simplicity of experiments for scenarios where nurd worked for the harder case even when classification bias is introduced to the data nurd does not perform particularly well postrevision summary the authors have nicely addressed most of my comments the text is still heavy on math over intuition but improved i am thus increasing my score docsepthis paper develops a representation method called nuisancerandomized distillation nurd for building predictive models with better generalizability ie testing performance without using spurious nuisancelabel correlations in observed data nurd breaks the nuisancelabel dependence and finds the most informative representation to predict the label for every distribution in the nuisance varying family which is a set of
distributions that differ only in the nuisancelabel relationship the nurd method is evaluated on several application datasets it produces accurate predictive models on the datasets with strong spurious correlations between the label and some nuisance variables strengths the paper proposes a method to solve an important and challenging problem of nuisancelabel correlations in supervised learning the paper provides a good theoretical analysis of the proposed method and interesting simulation results to demonstrate the applications of the proposed method the related work section is wellstructured and comprehensive including previous works using different ideas overall the paper is well written and easy to follow weaknesses and suggestions sections 2 and 3 only introduce the major component of the proposed method however it is not immediately clear to me to what kind of datasets we can apply the proposed method importantly how do we extract the nuisance variable z out of a given arbitrary labelled dataset equation 1 would be better introduced using some examples of x y and z in the second paragraph of the introduction for an image why is the data generating process of x conditional on z and y i thought both x and z are some pixels on an image at least in statistics predictor often refers to a variable a model uses to predict the target here predictor refers to the predictive model or distribution itself i encourage the authors to reconsider their terminology measuring the performance of a predictor by kl divergence seems more reasonable for a classification task if the label is continuous then comparing kl divergence does not tell us exactly which model is closer to the ground truth in proposition 3 in the distribution pindep x is generated by y and z then why would we have y independent of z conditional on x i am thinking of x as a collider of y and z in a directed graphical model is it true that for any pindep we can find an uncorrelated set of representations r(pindep) why the first equality in equation 4 the highlevel idea of nuisance randomization distillation nurd is clear in the paper i wonder if there are any distributions p(x, y, z) for which we do not want to use distillation in other words if there are any cases where we should optimize equation 6 or 7 without the second mutual information term for example if y is a function of x maximizing the expected loglikelihood should give the best predictor you mentioned quite a few related methods in section 4 is it possible to compare with some of them or could you provide some suggestions on when to use these related methods and when we should use nurd this paper proposes an interesting idea for an important problem it provides a good theoretical and experimental analysis of the proposed method i recommend accepting this paper ### Summary:
the paper studies how to build predictive models that are robust to nuisanceinduced spurious correlations present in the data it introduces nuisancerandomized distillation nurd constructed by reweighting the observed data to break the nuisancelabel dependence and find the most informative representation to predict the label experiments on several datasets show that by using a classifier learned on this representation nurd is able to improve the classification performance by limiting the impact of nuisance variables the main concerns were about the presentation and organization of the paper which was heavily focused on the theoretical justifications but fell short in explaining the intuitions and implementation details the revision and rebuttal have addressed some of these concerns and improved the overall exposition of the paper based on which two reviewers raised their scores to 8 while there is still room to further improve the paper by providing more detailed discussions about the proposed algorithms the ac considers the paper ready for publication in its current form
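The summary describes reweighting-NURD only at a high level; the sketch below illustrates the underlying importance-reweighting idea for binary labels and is not the authors' code. The estimator choices (logistic regression for both the p(y|z) model and the final predictor), the function names, and the omission of NURD's distillation / representation-learning step are simplifying assumptions made purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def nuisance_randomized_weights(y, p_y1_given_z):
    """Importance weights w = p(y) / p(y|z) that break the y-z dependence.

    y            : (n,) array of binary labels
    p_y1_given_z : (n,) array of estimated p(y=1 | z) for each example
    """
    p_y1 = y.mean()                                           # marginal p(y=1)
    p_y_given_z = np.where(y == 1, p_y1_given_z, 1.0 - p_y1_given_z)
    p_y = np.where(y == 1, p_y1, 1.0 - p_y1)
    return p_y / np.clip(p_y_given_z, 1e-6, None)

def fit_reweighted_predictor(x, y, z):
    """x, z are (n, d) feature arrays; z holds only the nuisance features."""
    z_clf = LogisticRegression().fit(z, y)                    # estimates p(y|z)
    w = nuisance_randomized_weights(y, z_clf.predict_proba(z)[:, 1])
    # the predictor on x is trained as if sampled from the reweighted,
    # nuisance-randomized distribution
    return LogisticRegression().fit(x, y, sample_weight=w)
```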
[ input_ids: integer token ids of the tokenized prompt and review text above — full listing omitted ]
[ attention_mask: sequence of 1s, one entry per input token — full listing omitted ]
[ labels: integer token ids duplicating input_ids — full listing omitted ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this work presents a selfsupervised method that trains a teacher to generate theorems and a solver to prove them the method applies to the special case of singleloop programs and an assertion that requires proof strengths 1 problem formulation of validation assertion is precise 2 the method uses vast augmentation to generate data 3 a highperformance implementation of alphazero and search weaknesses 1 the paper is not clearly written 2 limited to a very special case of a singleloop 3 doesnt generalize for example to nested loops and does not scale the work does not address the limitation of solving one problem type that is too specific and overfitting docsepthis paper proposes an alphazerolike approach to loop invariant synthesis specifically a teacher agent is first trained to generate programs which are then used to train a solver agent to incorporate domain knowledge both agents are modeled as nondeterministic programs which can be refined by implementing certain operators the solver agent is a nondeterministic program with the choose operator to be learned and the reward operator to be designed while the teacher agent is designed to fill a singleloop program template in order to generate a diverse set of training programs a number of teacher constraints are introduced different from the standard alphazero setting both agents are trained to not only predict the value of a state but also the associated events the implementation includes a domainspecific language for writing strategies a graphical interface for inspecting neural network predictions and an efficient implementation of the alphazero algorithm the evaluation performed on the benchmark from code2inv shows that looprl (this work) significantly outperforms code2inv strengths although fairly technical this paper is well written the motivation is clearly illustrated and the proposed methodology of learning teacher and solver agents looks very promising modeling search strategies as nondeterministic programs and then reducing strategy learning into refining programs are very novel the implementation including domain specific language design graphical interface highperformance mcts search etc is very solid the evaluation shows significant improvement compared with code2inv weaknesses the title makes a very general claim of finding proofs and theorems however only loop invariant synthesis for singleloop programs is concerned in the paper the design of teacher and solver agents are also specialized for the loop invariant synthesis problem it might be better to adjust the title to reflect what the paper is actually trying to address as hinted by cln2inv 29 the code2inv benchmarks are very easy to solve when domain knowledge like templates either explicit or implicit are used verification tasks from svcomp (https://sv-comp.sosy-lab.org/2022/index.php) would be a much better benchmark to test the full strength of looprl the current evaluation is relatively weak because the code2inv benchmark is a simple set of small singleloop programs evaluations on relatively large programs like the ones used in svcomp (https://sv-comp.sosy-lab.org/2022/index.php) will make this work much stronger docsepthis paper proposes an approach to synthesizing loop invariants with reinforcement learning the idea is to leverage abductive reasoning for which search strategies can be treated as nondeterministic programs an alphazerostyle rl algorithm is in turn used to refine the
strategies the paper also proposes a method to generate problems of invariant synthesis using rl to automate some of the processes described in the paper the authors also implemented the looprl theorem prover consisting of features that ease the burden of customising the approach the evaluation clearly shows the benefits of including the teacher process in this approach originality and clarity the idea of using rl for invariant synthesis itself is not new however to my knowledge the proposed method to generate new problems ie the teacher process is indeed novel even better the generation of new problems uses the same methodology used to solve the problems which leads to a uniform alphazerostyle solution to the problem of invariant synthesis the looprl theorem prover clearly embodies a significant amount of work which makes the proposed approaches practical the paper is well written and easy to follow overall it is a good attempt to find loop invariants using a combination of abductive reasoning and reinforcement learning quality and significance my main concern is the significance it is true that abductive reasoning is closer to the way human experts reason about invariants but the proposed approach is conceivably insufficient for the general task of theorem proving the crucial part of this approach is the abduct procedure which requires intensive domainspecific expert knowledge to engineer in the case of loop invariant synthesis studied in this paper inventing such an abduct procedure is relatively easy because the domain is mostly linear arithmetic and we have the fouriermotzkin elimination however it is unclear how an abduct procedure can be engineered for more advanced mathematics say algebraic geometry i can imagine that inventing these procedures requires deep analysis of human mathematical activities which essentially makes the proposed approach rely heavily on handengineered procedures for the above reasons i think that the title and the claims of this paper are perhaps too general invariant synthesis is a narrow instance of theorem proving ie generating cut formulae satisfying certain conditions and the paper only shows that the approach is promising for invariant synthesis the evaluation confirms the contribution of the teacher process but is rather unconvincing in terms of absolute performance the fact that looprl with the vanilla mcts can solve all the problems of code2inv benchmarks means that the expert knowledge is perhaps too strong or that the problems are too easy possibly both in fact a modern stateoftheart invariant synthesizer such as spacer (https://arxiv.org/abs/1405.4028) should also be able to solve all the problems in the code2inv benchmarks the reviewer suggests adding more problems to the test set such as problems from recent sygus competitions (https://sygus.org) yes the authors adequately addressed the limitations ### Summary:
this work presents an approach to learning to prove loopinvariant theorems organized around jointly training teacher and solver models reviewers praised its originality and creativity as well as the quality of the software artifacts produced both the ideas and the code could be valuable for the community at the same time there is a consensus which i agree with that the work oversells itself by claiming to be a general framework for learning to prove theorems it might be true that you could in principle apply the framework to other kinds of theorems but that would have to be shown empirically ditto for the claim that this can be applied to program synthesis which the paper makes in the very first sentence of the abstract given these overreaches the camera ready version of this paper needs to soften its claims about its broad applicability for theorem proving and program synthesis the authors also need to change the title so that it has loop invariant in it or something similar which they are receptive to in the rebuttal the paper can talk about these loftier ambitions in the conclusion but should clearly demarcate the actual extent of the empirical results
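For readers unfamiliar with the task these reviews discuss, the sketch below illustrates what "loop invariant synthesis for a single-loop program" asks a solver to produce: a formula that holds on entry to the loop, is preserved by every iteration, and implies the final assertion. The toy program, the candidate invariant, and the use of the z3-solver Python bindings are illustrative assumptions on my part; they are not taken from the reviewed paper, the code2inv benchmark, or the looprl implementation.

```python
# Hypothetical example: checking the three standard obligations of a loop
# invariant with z3 (pip install z3-solver). The program and invariant are
# made up for illustration, not drawn from the reviewed paper.
from z3 import Ints, And, Implies, Not, Solver, unsat, substitute

# Toy single-loop program:
#   x = 0; y = 0;
#   while (x < n): x = x + 1; y = y + 1;
#   assert y == x
x, y, n = Ints("x y n")
xp, yp = Ints("xp yp")                      # primed state after one iteration

init = And(x == 0, y == 0)                  # state after initialization
guard = x < n                               # loop guard
body = And(xp == x + 1, yp == y + 1)        # transition relation of the body
post = y == x                               # assertion checked after the loop

inv = y == x                                # candidate loop invariant

def valid(formula):
    """A formula is valid iff its negation is unsatisfiable."""
    s = Solver()
    s.add(Not(formula))
    return s.check() == unsat

initiation = Implies(init, inv)
consecution = Implies(And(inv, guard, body),
                      substitute(inv, (x, xp), (y, yp)))
safety = Implies(And(inv, Not(guard)), post)

for name, obligation in [("initiation", initiation),
                         ("consecution", consecution),
                         ("safety", safety)]:
    print(name, "holds" if valid(obligation) else "fails")
```

A solver agent in the style the reviews describe would have to search for `inv` automatically; the checker above only verifies a given candidate.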
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 789, 10262, 247, 1881, 35421, 1332, 326, 18784, 247, 9732, 281, 6635, 39383, 285, 247, 47037, 281, 5276, 731, 253, 1332, 10384, 281, 253, 2714, 1083, 273, 2014, 14075, 5659, 285, 271, 17077, 326, 4419, 4737, 50276, 296, 3755, 20556, 337, 1895, 15895, 273, 12820, 17077, 310, 10799, 374, 253, 1332, 4648, 8485, 42072, 281, 6635, 941, 495, 247, 1029, 24159, 7092, 273, 355, 545, 1370, 2771, 285, 3186, 50276, 20881, 1255, 265, 337, 253, 2929, 310, 417, 4518, 3542, 374, 3710, 281, 247, 1077, 2714, 1083, 273, 247, 2014, 14075, 495, 36908, 39970, 323, 1650, 281, 20494, 17417, 285, 1057, 417, 4311, 253, 789, 1057, 417, 2953, 253, 12291, 273, 16161, 581, 1895, 1511, 326, 310, 1512, 2173, 285, 689, 31893, 5474, 33032, 2520, 2929, 29328, 271, 355, 545, 24613, 311, 2804, 2746, 281, 6287, 13727, 9066, 5742, 247, 9732, 5570, 310, 806, 10166, 281, 6635, 5659, 534, 403, 840, 908, 281, 6194, 247, 47037, 5570, 281, 19071, 5028, 3640, 1097, 6083, 403, 23115, 347, 27370, 13109, 249, 2531, 5659, 534, 476, 320, 22407, 407, 16994, 2176, 9158, 253, 47037, 5570, 310, 247, 27370, 13109, 249, 2531, 2086, 342, 253, 5206, 5572, 281, 320, 6311, 285, 253, 10921, 5572, 281, 320, 4158, 1223, 253, 9732, 5570, 310, 4158, 281, 7522, 247, 2014, 14075, 2086, 7646, 275, 1340, 281, 6635, 247, 11117, 873, 273, 3733, 5659, 247, 1180, 273, 9732, 10806, 403, 5611, 1027, 432, 253, 2629, 355, 545, 1370, 2771, 4758, 1097, 6083, 403, 10166, 281, 417, 760, 3283, 253, 1318, 273, 247, 1375, 533, 671, 253, 2330, 3394, 253, 7092, 3797, 247, 10625, 29765, 3448, 323, 4028, 8130, 247, 29886, 5673, 323, 16030, 272, 11454, 2990, 13650, 285, 271, 5919, 7092, 273, 355, 545, 1370, 2771, 5933, 253, 7103, 2684, 327, 253, 22791, 432, 2127, 19, 7821, 2722, 326, 6287, 8435, 436, 789, 3012, 41731, 13015, 2127, 19, 7821, 50276, 296, 3755, 20556, 50276, 20261, 9648, 7681, 436, 2929, 310, 973, 8510, 257, 50276, 783, 16038, 310, 4518, 12800, 285, 253, 4081, 16182, 273, 4715, 9732, 285, 47037, 6083, 4453, 1077, 12532, 50276, 7645, 272, 3186, 8130, 347, 27370, 13109, 249, 2531, 5659, 285, 840, 8493, 5700, 4715, 715, 1275, 1699, 5659, 403, 1077, 4460, 50276, 783, 7092, 1690, 5028, 2173, 3448, 2216, 29886, 247, 15049, 1029, 24159, 278, 291, 84, 3186, 3966, 310, 1077, 4891, 50276, 783, 7103, 2722, 1534, 7756, 2429, 342, 2127, 19, 7821, 50276, 20881, 1255, 265, 50276, 783, 4060, 2789, 247, 1077, 2087, 1750, 273, 4560, 27947, 285, 39383, 2299, 760, 6287, 13727, 9066, 323, 2014, 14075, 5659, 310, 7514, 275, 253, 2929, 253, 2216, 273, 9732, 285, 47037, 6083, 403, 671, 2714, 728, 323, 253, 6287, 13727, 9066, 1895, 352, 1537, 320, 1805, 281, 4575, 253, 4060, 281, 4887, 752, 253, 2929, 310, 2686, 2820, 281, 2953, 50276, 284, 47466, 407, 502, 79, 19, 7821, 3285, 253, 2127, 19, 7821, 49602, 403, 1077, 3477, 281, 8415, 672, 5028, 3640, 751, 20665, 2057, 6843, 390, 3898, 882, 403, 908, 50276, 332, 1877, 8892, 432, 18504, 3118, 3614, 11427, 681, 793, 39847, 357, 2061, 938, 1423, 4663, 5581, 651, 320, 247, 1199, 1805, 22791, 281, 1071, 253, 2120, 4757, 273, 6287, 8435, 50275, 783, 1655, 7103, 310, 4942, 5075, 984, 253, 2127, 19, 7821, 22791, 310, 247, 2969, 873, 273, 1355, 2014, 14075, 5659, 27163, 327, 4942, 1781, 5659, 751, 253, 4394, 908, 275, 18504, 3118, 3614, 11427, 681, 793, 39847, 357, 2061, 938, 1423, 4663, 5581, 588, 1056, 436, 789, 1199, 10046, 5474, 33032, 2520, 2929, 29328, 271, 2746, 281, 35143, 3006, 6287, 38318, 
342, 35221, 4715, 253, 2934, 310, 281, 25057, 490, 43324, 14720, 323, 534, 3186, 8130, 476, 320, 4127, 347, 27370, 13109, 249, 2531, 5659, 271, 355, 545, 24613, 493, 2172, 391, 77, 5933, 310, 275, 1614, 908, 281, 22407, 253, 8130, 253, 2929, 671, 29328, 247, 1332, 281, 6635, 3237, 273, 13727, 9066, 970, 391, 77, 281, 3772, 366, 690, 273, 253, 4870, 2529, 275, 253, 2929, 253, 4477, 671, 9009, 253, 6287, 8435, 10012, 354, 332, 11253, 273, 3386, 326, 11990, 253, 7977, 273, 2840, 2182, 253, 2746, 253, 7103, 4518, 2722, 253, 5373, 273, 1690, 253, 9732, 1232, 275, 436, 2746, 3236, 414, 285, 19843, 50275, 783, 2934, 273, 970, 391, 77, 323, 13727, 9066, 3139, 310, 417, 747, 2299, 281, 619, 3640, 253, 4081, 1332, 281, 6635, 747, 3237, 26332, 253, 9732, 1232, 310, 6296, 4460, 1014, 1805, 253, 5978, 273, 747, 3237, 4648, 253, 1072, 16182, 908, 281, 8415, 253, 3237, 534, 5644, 281, 247, 6447, 355, 545, 24613, 493, 2172, 2900, 281, 253, 1895, 273, 13727, 9066, 253, 6287, 8435, 10012, 354, 332, 4518, 4443, 4550, 247, 1534, 2408, 273, 789, 534, 2789, 253, 4081, 7274, 8542, 253, 2929, 310, 973, 3542, 285, 3477, 281, 956, 4583, 352, 310, 247, 1175, 3177, 281, 1089, 6287, 38318, 970, 247, 5019, 273, 490, 43324, 14720, 285, 35221, 4715, 50276, 15177, 285, 8453, 50275, 2577, 2022, 4468, 310, 253, 8453, 352, 310, 2032, 326, 490, 43324, 14720, 310, 8003, 281, 253, 1039, 1966, 10071, 1921, 670, 38318, 533, 253, 4081, 2746, 310, 10686, 400, 1598, 12497, 323, 253, 2087, 4836, 273, 10012, 18597, 253, 9560, 629, 273, 436, 2746, 310, 253, 490, 1586, 5199, 534, 4419, 17193, 10625, 29765, 6485, 3640, 281, 16518, 50271, 249, 253, 1083, 273, 6287, 13727, 9066, 5421, 275, 436, 2929, 10242, 272, 824, 271, 490, 1586, 5199, 310, 4942, 3477, 984, 253, 5028, 310, 6571, 4872, 27844, 285, 359, 452, 253, 269, 13979, 693, 302, 91, 5914, 20408, 2299, 352, 310, 12744, 849, 271, 490, 1586, 5199, 476, 320, 28136, 323, 625, 7269, 23065, 1333, 20157, 12087, 891, 476, 8564, 326, 10242, 272, 841, 7259, 4419, 3676, 1783, 273, 1966, 15965, 4712, 534, 9093, 2789, 253, 4081, 2746, 15771, 11306, 327, 1133, 15179, 2122, 7259, 50275, 1542, 253, 1840, 4606, 891, 1158, 326, 253, 4060, 285, 253, 3916, 273, 436, 2929, 403, 4931, 1512, 2087, 13727, 9066, 310, 247, 6891, 4227, 273, 10012, 18597, 26332, 11365, 2624, 630, 335, 3348, 14127, 2176, 2515, 285, 253, 2929, 760, 2722, 326, 253, 2746, 310, 12532, 323, 13727, 9066, 50275, 783, 7103, 23849, 253, 7680, 273, 253, 9732, 1232, 533, 310, 2581, 10915, 87, 19163, 275, 2426, 273, 7880, 3045, 253, 958, 326, 6287, 8435, 342, 253, 26724, 278, 291, 84, 476, 8415, 512, 253, 3237, 273, 2127, 19, 7821, 49602, 2097, 326, 253, 6485, 3640, 310, 4931, 1512, 2266, 390, 326, 253, 3237, 403, 1512, 3477, 6830, 1097, 275, 958, 247, 4980, 1375, 23037, 14387, 13727, 35143, 6081, 824, 347, 38590, 3614, 39962, 2061, 5375, 1047, 1762, 1449, 1619, 943, 671, 320, 2104, 281, 8415, 512, 253, 3237, 275, 253, 2127, 19, 7821, 49602, 253, 37317, 5936, 6240, 625, 3237, 281, 253, 1071, 873, 824, 347, 3237, 432, 3332, 726, 28349, 31067, 3614, 84, 11550, 316, 2061, 50276, 9820, 253, 4477, 18212, 9713, 253, 7364, 2490, 187, 4118, 18435, 27, 2520, 789, 10262, 271, 2746, 281, 4715, 281, 5276, 6287, 25168, 39383, 10932, 1475, 26277, 3733, 9732, 285, 47037, 3210, 30628, 26108, 697, 3236, 414, 285, 22794, 347, 973, 347, 253, 3290, 273, 253, 3694, 24165, 4197, 1097, 253, 5697, 285, 253, 2127, 812, 320, 9865, 323, 253, 3114, 50276, 255, 253, 1072, 673, 627, 310, 247, 13969, 534, 891, 5194, 342, 326, 253, 789, 689, 84, 7042, 3139, 407, 15081, 
281, 320, 247, 2087, 7792, 323, 4715, 281, 5276, 39383, 1537, 320, 2032, 326, 368, 812, 275, 8063, 4647, 253, 7792, 281, 643, 9351, 273, 39383, 533, 326, 651, 452, 281, 320, 2011, 45190, 277, 35570, 323, 253, 1750, 326, 436, 476, 320, 3732, 281, 2086, 9066, 534, 253, 2929, 2789, 275, 253, 1077, 806, 6197, 273, 253, 12002, 50276, 28821, 841, 689, 250, 3844, 253, 6568, 4704, 2715, 273, 436, 2929, 3198, 281, 50007, 697, 3916, 670, 697, 3862, 30437, 323, 10012, 18597, 285, 2086, 9066, 253, 4477, 671, 878, 281, 1818, 253, 4060, 594, 326, 352, 556, 6287, 13727, 275, 352, 390, 1633, 2074, 534, 597, 403, 44952, 281, 275, 253, 30080, 22559, 253, 2929, 476, 2312, 670, 841, 2343, 649, 1321, 37321, 275, 253, 6452, 533, 943, 4518, 1471, 3178, 366, 253, 4588, 6070, 273, 253, 16774, 1543 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 789, 10262, 247, 1881, 35421, 1332, 326, 18784, 247, 9732, 281, 6635, 39383, 285, 247, 47037, 281, 5276, 731, 253, 1332, 10384, 281, 253, 2714, 1083, 273, 2014, 14075, 5659, 285, 271, 17077, 326, 4419, 4737, 50276, 296, 3755, 20556, 337, 1895, 15895, 273, 12820, 17077, 310, 10799, 374, 253, 1332, 4648, 8485, 42072, 281, 6635, 941, 495, 247, 1029, 24159, 7092, 273, 355, 545, 1370, 2771, 285, 3186, 50276, 20881, 1255, 265, 337, 253, 2929, 310, 417, 4518, 3542, 374, 3710, 281, 247, 1077, 2714, 1083, 273, 247, 2014, 14075, 495, 36908, 39970, 323, 1650, 281, 20494, 17417, 285, 1057, 417, 4311, 253, 789, 1057, 417, 2953, 253, 12291, 273, 16161, 581, 1895, 1511, 326, 310, 1512, 2173, 285, 689, 31893, 5474, 33032, 2520, 2929, 29328, 271, 355, 545, 24613, 311, 2804, 2746, 281, 6287, 13727, 9066, 5742, 247, 9732, 5570, 310, 806, 10166, 281, 6635, 5659, 534, 403, 840, 908, 281, 6194, 247, 47037, 5570, 281, 19071, 5028, 3640, 1097, 6083, 403, 23115, 347, 27370, 13109, 249, 2531, 5659, 534, 476, 320, 22407, 407, 16994, 2176, 9158, 253, 47037, 5570, 310, 247, 27370, 13109, 249, 2531, 2086, 342, 253, 5206, 5572, 281, 320, 6311, 285, 253, 10921, 5572, 281, 320, 4158, 1223, 253, 9732, 5570, 310, 4158, 281, 7522, 247, 2014, 14075, 2086, 7646, 275, 1340, 281, 6635, 247, 11117, 873, 273, 3733, 5659, 247, 1180, 273, 9732, 10806, 403, 5611, 1027, 432, 253, 2629, 355, 545, 1370, 2771, 4758, 1097, 6083, 403, 10166, 281, 417, 760, 3283, 253, 1318, 273, 247, 1375, 533, 671, 253, 2330, 3394, 253, 7092, 3797, 247, 10625, 29765, 3448, 323, 4028, 8130, 247, 29886, 5673, 323, 16030, 272, 11454, 2990, 13650, 285, 271, 5919, 7092, 273, 355, 545, 1370, 2771, 5933, 253, 7103, 2684, 327, 253, 22791, 432, 2127, 19, 7821, 2722, 326, 6287, 8435, 436, 789, 3012, 41731, 13015, 2127, 19, 7821, 50276, 296, 3755, 20556, 50276, 20261, 9648, 7681, 436, 2929, 310, 973, 8510, 257, 50276, 783, 16038, 310, 4518, 12800, 285, 253, 4081, 16182, 273, 4715, 9732, 285, 47037, 6083, 4453, 1077, 12532, 50276, 7645, 272, 3186, 8130, 347, 27370, 13109, 249, 2531, 5659, 285, 840, 8493, 5700, 4715, 715, 1275, 1699, 5659, 403, 1077, 4460, 50276, 783, 7092, 1690, 5028, 2173, 3448, 2216, 29886, 247, 15049, 1029, 24159, 278, 291, 84, 3186, 3966, 310, 1077, 4891, 50276, 783, 7103, 2722, 1534, 7756, 2429, 342, 2127, 19, 7821, 50276, 20881, 1255, 265, 50276, 783, 4060, 2789, 247, 1077, 2087, 1750, 273, 4560, 27947, 285, 39383, 2299, 760, 6287, 13727, 9066, 323, 2014, 14075, 5659, 310, 7514, 275, 253, 2929, 253, 2216, 273, 9732, 285, 47037, 6083, 403, 671, 2714, 728, 323, 253, 6287, 13727, 9066, 1895, 352, 1537, 320, 1805, 281, 4575, 253, 4060, 281, 4887, 752, 253, 2929, 310, 2686, 2820, 281, 2953, 50276, 284, 47466, 407, 502, 79, 19, 7821, 3285, 253, 2127, 19, 7821, 49602, 403, 1077, 3477, 281, 8415, 672, 5028, 3640, 751, 20665, 2057, 6843, 390, 3898, 882, 403, 908, 50276, 332, 1877, 8892, 432, 18504, 3118, 3614, 11427, 681, 793, 39847, 357, 2061, 938, 1423, 4663, 5581, 651, 320, 247, 1199, 1805, 22791, 281, 1071, 253, 2120, 4757, 273, 6287, 8435, 50275, 783, 1655, 7103, 310, 4942, 5075, 984, 253, 2127, 19, 7821, 22791, 310, 247, 2969, 873, 273, 1355, 2014, 14075, 5659, 27163, 327, 4942, 1781, 5659, 751, 253, 4394, 908, 275, 18504, 3118, 3614, 11427, 681, 793, 39847, 357, 2061, 938, 1423, 4663, 5581, 588, 1056, 436, 789, 1199, 10046, 5474, 33032, 2520, 2929, 29328, 271, 2746, 281, 35143, 3006, 6287, 38318, 
342, 35221, 4715, 253, 2934, 310, 281, 25057, 490, 43324, 14720, 323, 534, 3186, 8130, 476, 320, 4127, 347, 27370, 13109, 249, 2531, 5659, 271, 355, 545, 24613, 493, 2172, 391, 77, 5933, 310, 275, 1614, 908, 281, 22407, 253, 8130, 253, 2929, 671, 29328, 247, 1332, 281, 6635, 3237, 273, 13727, 9066, 970, 391, 77, 281, 3772, 366, 690, 273, 253, 4870, 2529, 275, 253, 2929, 253, 4477, 671, 9009, 253, 6287, 8435, 10012, 354, 332, 11253, 273, 3386, 326, 11990, 253, 7977, 273, 2840, 2182, 253, 2746, 253, 7103, 4518, 2722, 253, 5373, 273, 1690, 253, 9732, 1232, 275, 436, 2746, 3236, 414, 285, 19843, 50275, 783, 2934, 273, 970, 391, 77, 323, 13727, 9066, 3139, 310, 417, 747, 2299, 281, 619, 3640, 253, 4081, 1332, 281, 6635, 747, 3237, 26332, 253, 9732, 1232, 310, 6296, 4460, 1014, 1805, 253, 5978, 273, 747, 3237, 4648, 253, 1072, 16182, 908, 281, 8415, 253, 3237, 534, 5644, 281, 247, 6447, 355, 545, 24613, 493, 2172, 2900, 281, 253, 1895, 273, 13727, 9066, 253, 6287, 8435, 10012, 354, 332, 4518, 4443, 4550, 247, 1534, 2408, 273, 789, 534, 2789, 253, 4081, 7274, 8542, 253, 2929, 310, 973, 3542, 285, 3477, 281, 956, 4583, 352, 310, 247, 1175, 3177, 281, 1089, 6287, 38318, 970, 247, 5019, 273, 490, 43324, 14720, 285, 35221, 4715, 50276, 15177, 285, 8453, 50275, 2577, 2022, 4468, 310, 253, 8453, 352, 310, 2032, 326, 490, 43324, 14720, 310, 8003, 281, 253, 1039, 1966, 10071, 1921, 670, 38318, 533, 253, 4081, 2746, 310, 10686, 400, 1598, 12497, 323, 253, 2087, 4836, 273, 10012, 18597, 253, 9560, 629, 273, 436, 2746, 310, 253, 490, 1586, 5199, 534, 4419, 17193, 10625, 29765, 6485, 3640, 281, 16518, 50271, 249, 253, 1083, 273, 6287, 13727, 9066, 5421, 275, 436, 2929, 10242, 272, 824, 271, 490, 1586, 5199, 310, 4942, 3477, 984, 253, 5028, 310, 6571, 4872, 27844, 285, 359, 452, 253, 269, 13979, 693, 302, 91, 5914, 20408, 2299, 352, 310, 12744, 849, 271, 490, 1586, 5199, 476, 320, 28136, 323, 625, 7269, 23065, 1333, 20157, 12087, 891, 476, 8564, 326, 10242, 272, 841, 7259, 4419, 3676, 1783, 273, 1966, 15965, 4712, 534, 9093, 2789, 253, 4081, 2746, 15771, 11306, 327, 1133, 15179, 2122, 7259, 50275, 1542, 253, 1840, 4606, 891, 1158, 326, 253, 4060, 285, 253, 3916, 273, 436, 2929, 403, 4931, 1512, 2087, 13727, 9066, 310, 247, 6891, 4227, 273, 10012, 18597, 26332, 11365, 2624, 630, 335, 3348, 14127, 2176, 2515, 285, 253, 2929, 760, 2722, 326, 253, 2746, 310, 12532, 323, 13727, 9066, 50275, 783, 7103, 23849, 253, 7680, 273, 253, 9732, 1232, 533, 310, 2581, 10915, 87, 19163, 275, 2426, 273, 7880, 3045, 253, 958, 326, 6287, 8435, 342, 253, 26724, 278, 291, 84, 476, 8415, 512, 253, 3237, 273, 2127, 19, 7821, 49602, 2097, 326, 253, 6485, 3640, 310, 4931, 1512, 2266, 390, 326, 253, 3237, 403, 1512, 3477, 6830, 1097, 275, 958, 247, 4980, 1375, 23037, 14387, 13727, 35143, 6081, 824, 347, 38590, 3614, 39962, 2061, 5375, 1047, 1762, 1449, 1619, 943, 671, 320, 2104, 281, 8415, 512, 253, 3237, 275, 253, 2127, 19, 7821, 49602, 253, 37317, 5936, 6240, 625, 3237, 281, 253, 1071, 873, 824, 347, 3237, 432, 3332, 726, 28349, 31067, 3614, 84, 11550, 316, 2061, 50276, 9820, 253, 4477, 18212, 9713, 253, 7364, 2490, 187, 4118, 18435, 27, 2520, 789, 10262, 271, 2746, 281, 4715, 281, 5276, 6287, 25168, 39383, 10932, 1475, 26277, 3733, 9732, 285, 47037, 3210, 30628, 26108, 697, 3236, 414, 285, 22794, 347, 973, 347, 253, 3290, 273, 253, 3694, 24165, 4197, 1097, 253, 5697, 285, 253, 2127, 812, 320, 9865, 323, 253, 3114, 50276, 255, 253, 1072, 673, 627, 310, 247, 13969, 534, 891, 5194, 342, 326, 253, 789, 689, 84, 7042, 3139, 407, 15081, 
281, 320, 247, 2087, 7792, 323, 4715, 281, 5276, 39383, 1537, 320, 2032, 326, 368, 812, 275, 8063, 4647, 253, 7792, 281, 643, 9351, 273, 39383, 533, 326, 651, 452, 281, 320, 2011, 45190, 277, 35570, 323, 253, 1750, 326, 436, 476, 320, 3732, 281, 2086, 9066, 534, 253, 2929, 2789, 275, 253, 1077, 806, 6197, 273, 253, 12002, 50276, 28821, 841, 689, 250, 3844, 253, 6568, 4704, 2715, 273, 436, 2929, 3198, 281, 50007, 697, 3916, 670, 697, 3862, 30437, 323, 10012, 18597, 285, 2086, 9066, 253, 4477, 671, 878, 281, 1818, 253, 4060, 594, 326, 352, 556, 6287, 13727, 275, 352, 390, 1633, 2074, 534, 597, 403, 44952, 281, 275, 253, 30080, 22559, 253, 2929, 476, 2312, 670, 841, 2343, 649, 1321, 37321, 275, 253, 6452, 533, 943, 4518, 1471, 3178, 366, 253, 4588, 6070, 273, 253, 16774, 1543 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper proposes an approach for zeroshot reward specification using natural language groundings without expert demos and state information the work proposes using clips imagelanguage encoder to find the saliency maps and separately encoding the target locations for the object these objects are identified by parsing the task sentence the work proposes automatically getting a reward function by again using the sentence the work claims that automatically parsing the goal text and using a heuristic to get a reward function works better than using the clips language encoder to encode the goal text and get the rewards with a dot product the work proposes a simple and novel solution to the zeroshot reward specification problem which is an important step towards getting rl agents to work in new environments and tasks the experiments show promising results and support the authors claims weakness experiments have been performed on only simple scenarios and environments the natural language queries are simple and the authors dont talk about translating to more complex tasks using clip based text embedding with a dot product in the reward function performs worse than the heuristicbased reward function but since the embedding based approach is translatable more experiments could have been performed to understand the reason for the lower performance missing details on the construction of goaltext is it constructed in a way that is easy for the parser to parse missing related works in the space of learning language conditioned reward function or learning to perform language conditioned tasks 1 1 fu j korattikara a levine s guadarrama s from language to goals inverse reinforcement learning for visionbased instruction following arxiv preprint arxiv:1902.07742 2019 feb 20 the experiments in the paper are on toy examples and the approach proposed has several limitations for it to translate to other tasks for example the complexity of the environment and the task specification but the techniques proposed are simple and can be extended to more complex cases with further research docsepthe paper proposes a framework to train policies for tasks that are specified via language text without the use of any expert trajectories or underlying state information to engineer reward functions the authors leverage a stateoftheart visual grounding model clip to ground object nouns from the text into the current visual observation in order to derive a proxy for the rewards signal through some toy experiments they demonstrate performance of a simple baseline that implements this concept as well as enhancements to the baseline saliency grounding spatial relationship based rewards that overcome drawbacks of the naive approach they also train their languageconditioned policy on several tasks to obtain a generic policy that can learn to solve an arbitrary task given paired text descriptions and trajectory rollouts from the training tasks the paper tackles the problem of specifying goal configurations via language text this is a more general and therefore more challenging setting that is agnostic to the specific instance of the target objects/configurations presented via the goal image the empirical evidence presented by the authors seems to support their claims however given below are some of my concerns with the paper the goal text descriptions seem to be severely underspecifying tasks for instance consider an example of a
goal text from the paper "image of an inverted pendulum" this doesnt really say a lot about the desired task and it might hold true for several configurations within a delta tolerance of the actual goal state that makes me wonder if the kind of goaltext descriptions being considered in the paper can be treated as good proxies for specifying tasks i would imagine this problem becoming severe with task complexity when the approach is deployed to settings with slightly more complicated state spaces and task configurations for instance consider gridworld navigation what would a goal description that leads to a high correlation between grounding of objects in the current observation and progress towards the goal be for the task of going to location x in the grid the introductory section of the paper makes the case against reward engineering by saying that it utilizes the underlying state information the spatial grounding heuristics used in the proposed approach are getting computed based on the gt locations of objects in the scene doesnt that count as having access to state information and engineering rewards how do the authors reconcile this moreover computing the rewards based on whether spatial relationships as decoded from the goal text are being satisfied or not involves the use of multiple camera views this brings in additional complexity from a realworld deployment perspective it is not clear how their approach fundamentally differs from the reward structure used here (https://arxiv.org/pdf/1904.04404.pdf) the reward is derived from the iou and classification accuracies of predictions of the model and the policy gets trained depending on how well the agents perception can localize the target object one might say that they warm start their training with expert trajectories which the authors do not but to be fair the cited paper works with a much more complex state space 3d indoor scenes ignoring that the general principle of deriving a reward based on localization/grounding of a target semantic concept holds true here as well the paper is missing a lot of critical information such as there is no information of how the goal text descriptions are derived is it by using some sort of captioning model on the image goal description or are they manually annotated for every task the paper doesnt have any details on how the grounding of semantics and spatial relationships from the goal text onto the visual observations translates to numeric rewards that seems to be an important step in their approach also listed below are some minor suggestions and clarification questions in sec 22 the authors say that the goal text is parsed into object noun phrases and object interactions what does the latter mean are these the spatial relationships that the referred objects must satisfy did the authors try finetuning the maskrcnn is the comparison shown in fig 3 between non finetuned maskrcnn and clip if so do the authors have a sense for the degree of overlap between the vocabulary of classes between maskrcnn/clip and their environments in sec 32 the language makes it seem like your model is progressively trained using gt reward first and then your proposed visiolinguistic rewards this could get a little confusing consider rewording to ensure that its clear that one of them is your approach and the others are baselines overall i am not entirely convinced that the approach of simply grounding text descriptions onto image observations can serve as a reliable proxy for rewards in addition to that the paper in its current
format has a lot of missing details therefore my vote is to reject this paper docsepthis paper demonstrates how to use a pretrained clip model as a reward function for an rl agent this approach enables flexible goal specification using language the proposed reward generation method uses clip to identify the relevant objects and a separate module to compute the reward based on a specified spatial relationship this separation produces a better reward function than using the dot product between the embedded image vector and language vector the idea to use natural language grounded in the scene as the reward function is an interesting idea and a goal for language grounded rl however i have some concerns/questions 1 definition of zeroshot the reward function is determined by the heuristics defined in table 1 these heuristics encode the knowledge about the goal so the reward function is not entirely zeroshot as the heuristics encode the structure of the goal 2 the use of natural language is limited originally i expected to see the reward function that is directly computed from the clip image embed and text embed it is reasonable that the paper shows the dot product doesnt represent the spatial relationship well however the final proposal ie using a clipbased saliency map and a separate spatial module doesnt leverage the visionlanguage model well the use of clip here is closer to an object detector while this paper showed that the pretrained mask rcnn doesnt provide good detection this still doesnt change the fact that clip is used as an object detector in this case any color detector may work as well too then how does a pretrained visionlanguage model help in the reward specification this separation also limits the possibility to extend to other language descriptions beyond spatial relationships 3 clipbase result in figure 6 even though the reward using a simple dot product may not be good in all cases why does figure 6 show the base result to be always flat it doesnt seem likely that an rl agent that uses this reward function doesnt learn anything from the beginning 4 alternative clip baseline while the dot product version doesnt work im wondering if there is an alternative formulation for example turning the reward specification into classification of objects of different spatial relationships this would remove the need for separate spatial heuristics the idea of the paper is interesting however the final implementation doesnt really need the pretrained visionlanguage model docsepthe paper presents a hybrid learning/rulebased approach to languageconditioned reward specification particularly for robotic control tasks from images the core contribution of the work is in specifying the reward 0shot specifically the method parses a language instruction into an object noun phrase and desired spatial configuration then it uses clip embedding of each object phrase as a query and uses gradcam to get a saliency map in the image which is used to localize the target object finally a set of heuristics are used to compute the reward for a particular spatial configuration given the target objects coordinates the reward can then be used to train either single or multitask policies experiments suggest that in simulated fetch robot domains this approach can nearly match an oracle reward strengths the paper tackles an interesting and important problem in 0shot reward specification solving 0shot reward specification particularly from language is essential to getting rl to work in new environments additionally the
approach proposed in this work is creative and performs well in the domains considered from images weaknesses the weaknesses in this work are 1 the generality of the proposed approach 2 the experimental domains/comparisons and 3 coverage of prior work 1 the paper proposes an interesting approach to 0 shot reward specification in explicitly mapping an instruction to nouns and spatial relations using clip to localize a noun then using the position and spatial relation with heuristics to define the reward however i think there are a number of limitations of this approach that the paper should discuss further first parsing the instruction into objects and into a fixed set of spatial relations does seem limited to programmatic language and the specific tasks in the paper eg in the paper the instruction "an image of a yellow block on top of a red block" directly maps to the "on top of" rule but this likely will not be trivial in the case where the instruction is in the form of natural language even for the inverted pendulum example shown earlier in the paper its not clear how a parser would map that to objects and a specific spatial relation second the approach of extracting the object positions with gradcam is a clever idea but it does seem very specific to environment tasks and viewpoints considered in the paper in a more interesting visual scene with more than 2 objects its not clear that the gradcam would provide as good object localization the paper also mentions that for this environment very specific camera viewpoints are needed for the gradcam localization to work moreover there are many tasks eg opening a cabinet where its not clear how to heuristically define a reward based on the gradcam localization or what part of the cabinet the saliency map will map to it does seem that most of the reward specification is actually coming in the form of hardcoded rules and the main use of clip is in localizing objects 2 the experiments show that on some simple tabletop manipulation tasks the proposed approach can nearly match an oracle reward function this is an interesting result however it says more about the ability to use clip to localize objects than it does about the particular reward function approach since the heuristic reward is the same between both this method and the oracle the curiosity driven baselines also are not that informative since they do not get the language instruction some more interesting comparisons would be a how the method compares to goalimage specifications and b how it might compare to using an off the shelf imagecaptioning approach it would also be valuable to see if this approach can handle a more complex natural language instructions and b more interesting environments with more complex visuals than the 2 solid colored blocks 3 there are a number of missing related works 1-4 which also learn language conditioned skills on robots particularly 1-3 also learn to ground language to goals or rewards 1 macglashan et al grounding english commands to reward functions rss 2015 2 arumugam et al grounding natural language instructions to semantic goal representations for abstraction and generalization autonomous robots 2019 3 nair et al learning languageconditioned robot behavior from offline data and crowdsourced annotation corl 2021 4 lynch and sermanet language conditioned imitation learning over unstructured data rss 2021 overall the paper presents an interesting and creative solution to the important problem of 0shot language specification for robot manipulation from images
however the proposed method seems to be task/environment/viewpoint specific and the paper should further discuss these limitations the paper can also improve its experimental domains baseline comparisons and coverage of related work ### Summary:
this manuscript describes a method that turns sentences into reward functions by recognizing objects parsing sentences into a simple formalism and then grounding the parse in the recognized objects to form a reward for an agent 1 the title and much of the manuscript are written in a way that reviewers found confusing it would seem from the title and most of the text that the method integrates language models clip specifically into rl in a novel way to provide zeroshot rewards but this is not the case clip is used purely as an object detector yes the method requires a good object detector and clip provides that but any good object detector that can handle arbitrary phrases would have done 2 the overall setup of the work extract the state of the world and then parse sentences to formulate rewards by grounding parts of the parse into parts of the world state has been explored widely in robotics reviewers provided citations going back several years but many others exist i would encourage the authors to rewrite the manuscript around their central contributions and downgrade their use of clip and language models in general to a minor technical footnote similarly refocusing related work on the robotics literature and demonstrating how this approach differs and improves on the state of the art there could result in a strong contribution
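To make the distinction in this row concrete, here is a small sketch of the two reward formulations the reviews and summary keep contrasting: a raw clip dot-product reward versus a heuristic reward computed from parsed object locations and a spatial relation. This is my own reconstruction for illustration, not the authors' code; the openai clip package usage follows its standard interface, and the localization step, the "on top of" rule, and the 0.05 tolerance are assumptions.

```python
# Hypothetical sketch contrasting (a) a CLIP dot-product reward and (b) a
# parse-based spatial-heuristic reward. Requires torch, pillow, and the
# OpenAI CLIP package (pip install git+https://github.com/openai/CLIP.git).
import torch
import clip
from PIL import Image

device = "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

def dot_product_reward(image_path: str, goal_text: str) -> float:
    """Baseline criticized in the reviews: cosine similarity between the
    CLIP embeddings of the observation and of the goal text."""
    image = preprocess(Image.open(image_path)).unsqueeze(0).to(device)
    text = clip.tokenize([goal_text]).to(device)
    with torch.no_grad():
        img_emb = model.encode_image(image)
        txt_emb = model.encode_text(text)
    img_emb = img_emb / img_emb.norm(dim=-1, keepdim=True)
    txt_emb = txt_emb / txt_emb.norm(dim=-1, keepdim=True)
    return float((img_emb @ txt_emb.T).item())

def on_top_of_reward(target_xy, base_xy, tol: float = 0.05) -> float:
    """Heuristic reward for an instruction parsed as (target, 'on top of',
    base), given object centers localized elsewhere (e.g. from a saliency
    map): high when the objects are horizontally aligned and the target is
    above the base (image y grows downward)."""
    dx = abs(target_xy[0] - base_xy[0])
    target_is_above = target_xy[1] < base_xy[1]
    alignment = max(0.0, 1.0 - dx / tol)
    return alignment if target_is_above else 0.0
```

The reviewers' point is that only the second function actually encodes the spatial relation, while clip itself ends up being used just to find the objects.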
[ 10921, 11369, 407, 3981, 326, 352, 29820, 253, 6944, 1375, 1491, 253, 8820, 3216, 272, 344, 321, 3397, 908, 275, 253, 4081, 2746, 403, 2970, 10302, 1754, 327, 253, 305, 85, 8593, 273, 5113, 275, 253, 6200, 36908, 326, 1385, 347, 1907, 2289, 281, 1375, 1491, 285, 11369, 23267, 849, 513, 253, 4477, 42853, 436, 25761, 12672, 253, 23267, 1754, 327, 1880, 8820, 7688, 347, 45775, 432, 253, 4736, 2505, 403, 1146, 10048, 390, 417, 8687, 253, 897, 273, 2709, 6568, 6849, 436, 10316, 275, 3081, 10454, 432, 247, 1524, 10186, 19007, 8668, 50275, 262, 310, 417, 2590, 849, 616, 2746, 26401, 19986, 432, 253, 10921, 2605, 908, 1060, 5987, 39962, 2061, 9275, 16129, 20611, 20611, 9275, 253, 10921, 310, 6012, 432, 253, 891, 276, 285, 9162, 3933, 19103, 273, 13650, 273, 253, 1566, 285, 253, 3646, 4850, 10166, 7293, 327, 849, 973, 253, 6083, 13071, 476, 1980, 907, 253, 2303, 1789, 581, 1537, 1333, 326, 597, 5890, 1265, 616, 3733, 342, 6485, 24102, 534, 253, 4477, 513, 417, 533, 281, 320, 4344, 253, 11106, 2929, 2987, 342, 247, 1199, 625, 2570, 1375, 2317, 495, 69, 24340, 13451, 23111, 326, 253, 2087, 8063, 273, 44190, 247, 10921, 1754, 327, 14536, 2595, 272, 273, 247, 2303, 24705, 4473, 6556, 2032, 1060, 347, 973, 50275, 783, 2929, 310, 5816, 247, 2257, 273, 4619, 1491, 824, 347, 50274, 9088, 310, 642, 1491, 273, 849, 253, 4736, 2505, 20121, 403, 6012, 310, 352, 407, 970, 690, 3686, 273, 11743, 272, 1566, 327, 253, 2460, 4736, 5740, 390, 403, 597, 13542, 28267, 323, 1046, 4836, 50274, 783, 2929, 36908, 452, 667, 4278, 327, 849, 253, 3216, 272, 273, 35185, 285, 8820, 7688, 432, 253, 4736, 2505, 4830, 253, 5304, 7313, 30376, 281, 31437, 23267, 326, 3133, 281, 320, 271, 1774, 3213, 275, 616, 2746, 50276, 12563, 7117, 2708, 403, 690, 5884, 13991, 285, 37699, 3533, 50276, 249, 4706, 3307, 253, 4477, 1333, 326, 253, 4736, 2505, 310, 36838, 715, 1789, 28407, 25491, 285, 1789, 6355, 752, 1057, 253, 6158, 1599, 403, 841, 253, 8820, 7688, 326, 253, 6289, 5113, 1364, 10517, 50276, 14958, 253, 4477, 1611, 1442, 292, 25004, 253, 8989, 3373, 9866, 310, 253, 5301, 2011, 275, 3036, 495, 875, 1327, 1442, 292, 37437, 8989, 3373, 9866, 285, 17230, 604, 594, 513, 253, 4477, 452, 247, 3282, 323, 253, 4248, 273, 14787, 875, 253, 30318, 273, 5971, 875, 8989, 3373, 9866, 11536, 285, 616, 12620, 50276, 249, 4706, 4567, 253, 3448, 2789, 352, 1646, 751, 634, 1566, 310, 31414, 10166, 970, 305, 85, 10921, 806, 285, 840, 634, 4081, 1649, 10966, 5191, 2531, 23267, 436, 812, 755, 247, 1652, 21643, 1908, 294, 88, 1573, 281, 5416, 326, 697, 2590, 326, 581, 273, 731, 310, 634, 2746, 285, 253, 2571, 403, 1666, 25379, 4583, 891, 717, 417, 7094, 13762, 326, 253, 2746, 273, 3365, 3216, 272, 2505, 20121, 4830, 2460, 7313, 476, 5752, 347, 247, 9630, 17335, 323, 23267, 275, 1635, 281, 326, 253, 2929, 275, 697, 1655, 5981, 556, 247, 2257, 273, 5816, 4278, 3103, 619, 6273, 310, 281, 12009, 436, 2929, 5474, 33032, 2520, 2929, 14371, 849, 281, 897, 3215, 11273, 17230, 1566, 347, 247, 10921, 1159, 323, 271, 391, 77, 5570, 436, 2746, 13276, 12112, 4736, 17776, 970, 3448, 253, 4081, 10921, 5978, 1332, 4648, 17230, 281, 4271, 253, 4623, 5113, 285, 247, 4858, 6333, 281, 11897, 253, 10921, 1754, 327, 247, 7616, 8820, 2954, 436, 9712, 11330, 247, 1805, 10921, 1159, 685, 970, 253, 14261, 1885, 875, 253, 12691, 2460, 4972, 285, 3448, 4972, 50276, 783, 2934, 281, 897, 3626, 3448, 28462, 275, 253, 6200, 347, 253, 10921, 1159, 310, 271, 4722, 2934, 285, 247, 4736, 323, 3448, 28462, 391, 77, 2299, 891, 452, 690, 7350, 34974, 50276, 18, 5426, 273, 1182, 254, 6934, 
302, 253, 10921, 1159, 310, 3413, 407, 253, 344, 321, 3397, 2931, 275, 2829, 337, 841, 344, 321, 3397, 22573, 253, 3640, 670, 253, 4736, 594, 253, 10921, 1159, 310, 417, 7094, 1182, 254, 6934, 302, 347, 253, 344, 321, 3397, 22573, 253, 2605, 273, 253, 4736, 50276, 19, 253, 897, 273, 3626, 3448, 310, 3710, 8927, 891, 3264, 281, 923, 253, 10921, 1159, 326, 310, 3587, 10302, 432, 253, 17230, 2460, 8473, 285, 2505, 8473, 352, 310, 5272, 326, 253, 2929, 2722, 253, 14261, 1885, 36908, 1957, 253, 8820, 2954, 973, 2299, 253, 2457, 10419, 26332, 970, 247, 17230, 3169, 3779, 4364, 3711, 285, 247, 4858, 8820, 6333, 36908, 25057, 253, 8113, 12982, 1566, 973, 253, 897, 273, 17230, 1060, 310, 8003, 281, 271, 1789, 13562, 1223, 436, 2929, 2692, 326, 253, 3215, 11273, 8989, 27657, 9866, 36908, 2085, 1175, 5481, 436, 1335, 36908, 921, 1818, 253, 958, 326, 17230, 310, 908, 347, 271, 1789, 13562, 275, 436, 1083, 667, 3295, 13562, 778, 789, 347, 973, 1512, 840, 752, 1057, 247, 3215, 11273, 8113, 12982, 1566, 1361, 275, 253, 10921, 17776, 436, 9712, 671, 7787, 253, 6387, 281, 9017, 281, 643, 3448, 20121, 4457, 8820, 7688, 50276, 20, 17230, 4793, 906, 275, 4677, 721, 1014, 2167, 253, 10921, 970, 247, 2969, 14261, 1885, 778, 417, 320, 1175, 275, 512, 2219, 2139, 1057, 4677, 721, 921, 253, 2613, 906, 281, 320, 1900, 6507, 352, 36908, 1646, 2779, 326, 391, 77, 5570, 326, 4648, 436, 10921, 1159, 36908, 3037, 2712, 432, 253, 5068, 50276, 21, 5795, 17230, 8245, 1223, 253, 14261, 1885, 2715, 36908, 789, 516, 12371, 604, 627, 310, 271, 5795, 15895, 323, 1650, 7819, 253, 10921, 17776, 715, 9162, 273, 5113, 273, 1027, 8820, 7688, 436, 651, 5386, 253, 878, 323, 4858, 8820, 344, 321, 3397, 253, 2934, 273, 253, 2929, 310, 4722, 2299, 253, 2457, 7092, 36908, 1663, 878, 253, 3215, 11273, 8113, 12982, 1566, 5474, 339, 431, 248, 2929, 10262, 247, 9769, 4715, 15093, 3169, 2746, 281, 3448, 44321, 10921, 17776, 3782, 323, 35121, 1453, 8892, 432, 3888, 253, 5161, 7680, 273, 253, 789, 310, 275, 31238, 253, 10921, 470, 11860, 5742, 253, 1332, 13328, 265, 247, 3448, 9775, 715, 271, 1789, 28407, 12616, 285, 6799, 8820, 6661, 840, 352, 4648, 17230, 21496, 273, 1016, 1789, 12616, 347, 247, 7316, 285, 4648, 3805, 12583, 281, 755, 247, 3779, 4364, 3711, 275, 253, 2460, 534, 310, 908, 281, 1980, 907, 253, 2303, 1789, 4720, 247, 873, 273, 344, 321, 3397, 403, 908, 281, 11897, 253, 10921, 323, 247, 1798, 8820, 6661, 1677, 253, 2303, 5113, 11627, 253, 10921, 476, 840, 320, 908, 281, 6194, 2057, 2014, 390, 1554, 262, 1945, 7823, 4679, 1804, 326, 275, 15524, 20279, 15688, 10625, 436, 2746, 476, 4829, 3761, 271, 42295, 10921, 50275, 296, 3755, 20556, 50276, 783, 2929, 39223, 285, 4722, 285, 1774, 1895, 275, 470, 11860, 10921, 17776, 16161, 470, 11860, 10921, 17776, 3782, 432, 3448, 310, 5667, 281, 2970, 391, 77, 281, 789, 275, 747, 12620, 23000, 253, 2746, 4081, 275, 436, 789, 310, 10995, 285, 17923, 973, 275, 253, 10625, 2783, 432, 3888, 50276, 20881, 1255, 265, 50276, 783, 32213, 275, 436, 789, 403, 337, 253, 31376, 273, 253, 4081, 2746, 374, 253, 5661, 10625, 681, 1148, 10047, 285, 495, 7031, 273, 2720, 789, 50276, 18, 253, 2929, 29328, 271, 4722, 2746, 281, 470, 5103, 10921, 17776, 275, 11120, 10603, 271, 9775, 281, 28407, 84, 285, 8820, 2493, 970, 17230, 281, 1980, 907, 247, 28407, 840, 970, 253, 1899, 285, 8820, 5886, 342, 344, 321, 3397, 281, 4853, 253, 10921, 2299, 891, 1158, 627, 403, 247, 1180, 273, 7364, 273, 436, 2746, 326, 253, 2929, 943, 2319, 2007, 50275, 7053, 29072, 253, 9775, 715, 5113, 285, 715, 247, 4229, 873, 273, 8820, 2493, 
1057, 3133, 3710, 281, 2086, 33778, 3448, 285, 253, 2173, 8892, 275, 253, 2929, 24088, 275, 253, 2929, 253, 9775, 271, 2460, 273, 247, 8862, 2972, 327, 1755, 273, 247, 2502, 2972, 3587, 8115, 281, 253, 327, 1755, 273, 4086, 533, 436, 2779, 588, 417, 320, 14916, 275, 253, 1083, 835, 253, 9775, 310, 275, 253, 830, 273, 3626, 3448, 1014, 323, 253, 28483, 32752, 15508, 1650, 2011, 4321, 275, 253, 2929, 697, 417, 2590, 849, 247, 22109, 651, 3711, 326, 281, 5113, 285, 247, 2173, 8820, 5886, 50275, 9815, 253, 2746, 273, 34705, 253, 1789, 6887, 342, 3805, 12583, 310, 247, 19080, 2934, 533, 352, 1057, 1646, 1077, 2173, 281, 3126, 8892, 285, 1859, 10801, 2783, 275, 253, 2929, 275, 247, 625, 4722, 5304, 6200, 342, 625, 685, 374, 5113, 697, 417, 2590, 326, 253, 3805, 12583, 651, 2085, 347, 1175, 1789, 14536, 253, 2929, 671, 25957, 326, 323, 436, 3126, 1077, 2173, 6568, 1859, 10801, 403, 3058, 323, 253, 3805, 12583, 14536, 281, 789, 25761, 627, 403, 1142, 8892, 24088, 5909, 247, 19211, 835, 697, 417, 2590, 849, 281, 344, 321, 18260, 4853, 247, 10921, 1754, 327, 253, 3805, 12583, 14536, 390, 752, 629, 273, 253, 19211, 253, 3779, 4364, 3711, 588, 3711, 281, 50276, 262, 1057, 1646, 326, 954, 273, 253, 10921, 17776, 310, 2686, 3551, 275, 253, 830, 273, 1892, 38059, 4803, 285, 253, 2022, 897, 273, 17230, 310, 275, 1980, 3006, 5113, 50275, 19, 253, 4679, 921, 326, 327, 690, 2969, 2829, 3956, 19763, 8892, 253, 4081, 2746, 476, 4829, 3761, 271, 42295, 10921, 1159, 436, 310, 271, 4722, 906, 2299, 2296, 625, 670, 253, 3745, 281, 897, 17230, 281, 1980, 907, 5113, 685, 352, 1057, 670, 253, 1798, 10921, 1159, 2746, 1580, 253, 47641, 10921, 310, 253, 1072, 875, 1097, 436, 1332, 285, 253, 42295, 253, 24536, 8877, 1666, 25379, 671, 403, 417, 326, 27096, 1580, 597, 513, 417, 755, 253, 3448, 9775, 690, 625, 4722, 14023, 651, 320, 247, 849, 253, 1332, 26662, 281, 4736, 5695, 23944, 285, 270, 849, 352, 1537, 7277, 281, 970, 271, 745, 253, 22826, 2460, 34480, 272, 2746, 352, 651, 671, 320, 9865, 281, 923, 604, 436, 2746, 476, 6016, 247, 625, 2570, 3626, 3448, 7997, 285, 270, 625, 4722, 12620, 342, 625, 2570, 5304, 84, 685, 253, 374, 4891, 18010, 8336, 50275, 20, 627, 403, 247, 1180, 273, 5816, 2905, 2987, 1249, 1706, 534, 671, 3037, 3448, 27039, 6936, 327, 25497, 3782, 2145, 671, 3037, 281, 3216, 3448, 281, 7342, 390, 23267, 50275, 18, 5315, 3129, 1225, 266, 1162, 355, 3216, 272, 48087, 13896, 281, 10921, 3470, 391, 859, 4104, 337, 549, 360, 814, 312, 1162, 355, 3216, 272, 3626, 3448, 7997, 281, 24705, 4736, 14237, 323, 38562, 285, 26647, 26279, 25497, 6247, 495, 295, 1094, 1162, 355, 4715, 3448, 44321, 15688, 3879, 432, 28841, 941, 285, 24597, 47549, 22581, 944, 77, 43425, 577, 50117, 348, 285, 256, 8592, 292, 3448, 27039, 45738, 4715, 689, 440, 34218, 941, 391, 859, 43425, 50276, 1189, 455, 253, 2929, 10262, 271, 4722, 285, 10995, 2900, 281, 253, 1774, 1895, 470, 11860, 3448, 17776, 323, 15688, 19763, 432, 3888, 2299, 253, 4081, 1332, 3133, 281, 320, 4836, 20034, 1374, 3659, 2173, 285, 253, 2929, 943, 2007, 2319, 841, 7364, 253, 2929, 476, 671, 3157, 697, 5661, 10625, 8245, 14023, 285, 7031, 273, 2905, 789, 50276, 187, 187, 4118, 18435, 27, 2520, 7714, 8631, 247, 1332, 326, 7819, 14683, 715, 10921, 3470, 407, 26182, 5113, 29072, 14683, 715, 247, 2969, 30221, 285, 840, 3216, 272, 253, 14390, 275, 253, 7478, 5113, 281, 830, 247, 10921, 323, 271, 5570, 50276, 18, 253, 4060, 285, 1199, 273, 253, 7714, 403, 3542, 275, 247, 1039, 326, 30628, 1119, 21643, 352, 651, 1646, 432, 253, 4060, 285, 954, 273, 253, 2505, 326, 253, 1332, 
49661, 3448, 3210, 17230, 5742, 715, 391, 77, 275, 247, 4460, 1039, 281, 2085, 1182, 254, 6934, 302, 23267, 533, 436, 310, 417, 253, 1083, 17230, 310, 908, 15846, 347, 271, 1789, 13562, 4754, 253, 1332, 4419, 247, 1175, 1789, 13562, 285, 17230, 3400, 326, 533, 667, 1175, 1789, 13562, 326, 476, 6016, 10341, 25491, 651, 452, 2218, 50276, 19, 253, 4583, 9978, 273, 253, 789, 4908, 253, 1375, 273, 253, 1533, 285, 840, 14390, 14683, 281, 36803, 23267, 407, 3216, 272, 4243, 273, 253, 14390, 715, 4243, 273, 253, 1533, 1375, 556, 644, 14859, 7561, 275, 15688, 982, 30628, 2530, 30404, 1469, 896, 2067, 1107, 533, 1142, 2571, 2226, 50276, 74, 651, 11907, 253, 4477, 281, 24813, 253, 7714, 1475, 616, 4275, 9021, 285, 1066, 7698, 616, 897, 273, 17230, 285, 3448, 3210, 275, 2087, 281, 247, 5884, 7681, 43302, 12014, 1275, 2423, 272, 2905, 789, 327, 253, 15688, 982, 6239, 285, 17227, 849, 436, 2746, 19986, 285, 19132, 327, 253, 1375, 273, 253, 1445, 627, 812, 906, 275, 247, 2266, 7680 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposes an ensemble model between a full finetuning model and a parameterefficient finetuning model to improve the outofdistribution ood performance of a full finetuning model the proposed method is inspired by the observation that full finetuning model achieves good indistribution id performance while parameterefficient finetuning model achieves better ood performance there are two ensembling methods presented in the paper linear interpolation between the predictions of the two models and distill from the predictions of a parameterefficient model with id training data improved ood performance is observed with this ensemble method strengths the writing is generally clear the paper is addressing an important phenomenon that parameterefficient tuning which has deficient id performance while full finetuning has worse ood performance weakness first of all the presented methodology is not well motivated parameterefficient tuning aims to reduce number of parameters that need to save for each task by only finetuning a small number of additional parameters however the proposed method produces an ensemble model that finetunes all parameters losing the parameterefficiency if the goal of this paper is to improve the ood performance of full finetuning it should compare with methods that improve ood performance which has a vast amount of related work and this literature review is also missing from the paper on the other hand if the goal of this paper is to improve the id performance of parameterefficient tuning then it should propose a method for improving parameterefficient tuning methods rather than combine it with full finetuning second although its unfair to mention recent work after submission time there is recent work on parametertuning that shows proper improvements over parameterefficient tuning methods could match the performance of full finetuning this makes the motivation of this paper less meaningful third the experiments do not reflect how and why the hyperparameters length of prefix vectors bottleneck dimension of parameterefficient tuning are selected which is important for the discussions and conclusions for example if with tuned hyperparameters the id results of parameterefficient tuning methods could be close to full finetuning while preserving good ood performance then the proposed method makes no sense any more another minor flaw is that when prefixtuning is introduces its described as it prepends a sequence of trainable taskspecific prefix vectors to the input which actually is not true prefixtuning prepends tunable vectors to the projected key and value vectors instead of the input another question is the reported rouge2 score on xsum is quite low 212 on id compared to the original reported result in bart paper 2227 on the full test set i would guess the id performance is even higher or at lease to this number why is it the motivation of this paper is not welldefined the goal of this paper is to improve the ood performance of a fullfinetuning model however it didnt compare with any method on ood generalization domain adaptation the proposed method also makes parameterefficient tuning method not attractive any more docsepthis paper presents interesting an idea of combining lightweight finetuning and full finetuning to achieve the best of both approaches ie perform best on outofdomain and indomain data the authors proposed two approaches a simple ensemble 
method and a socalled cocktail finetuning that combines two finetuning methods in one single model they evaluated their tasks in three datasets webnlg xsum and openqa and obtained mixed results the authors also provided good analyses for more insights strengthens this paper addressed interesting research topic and the proposed method is interesting and promising this paper is clear and therefore it is easy to follow and to understand the analyses are interesting and provide insightful information weaknesses the experimental setup of this paper might be problematic two out of the three tasks are generation tasks in which bleu or rouge2 evaluation scores are difficult to interpret ie it is very difficult to judge the effectiveness of the proposed method the experimental results are mixed ie it is difficult to draw strong conclusions from the results furthermore it is not clear where the results are significant different among systems the definitions of id and ood are not well described in this paper and need to be improved furthermore it is not clear how many examples of id and ood were used in the datasets overall i rate this paper as marginally above the acceptance threshold mainly because the idea is interesting and somewhat novel however there are some weaknesses esp in the experimental setup and results that make this paper a borderline paper docsepthe present paper first discusses the tradeoff between performance for outofdomain data and indomain data with respect to whether the model is fully finetuned or lightweight finetuned on nlg tasks second it argues that such a tradeoff is not necessary if one can make use of both of these two finetuning schema in a clever way to this end it proposes cocktail finetuning which augments full finetuning via distillation from a lightweight model and which achieves equal performance as an ensemble of the two finetuning schema at length this paper also explains the behavior of the cocktail finetuning through a toy model the current paper looks perfect up to section 4 it properly discusses different finetuning schema as well as the tradeoff between ood performance and id performance with respect to the use of a funetuning scheme it properly defines the problem ie the tradeoff and proposes a simple yet effective way to overcome it it tests its method on three different nlg tasks table2text generation summarisation and qa however starting from section 5 it makes me hard to recommend an acceptance for this paper in the current format this is because of two major concerns one is that the interpretations of experimental results are problematic weight might make the rest discussions and conclusions do not stand anymore in section 5 regarding table 2 the authors say when evaluating id full finetuning achieves the stronger results whereas lightweight finetuning achieves only 81 of the skyline on average this statement is only true on average and overlooks the huge differences between different tasks for example for id data of the data2text task full finetuning achieves a score of 6325 while lightweight finetuning achieve 6318 there is no significant difference between the performance on id of these two schema meanwhile in a similar vein for ood samples of the summarisation task i also found no significant difference between the two schema the situation described by the author only exists when doing question answering this suggests that such a tradeoff is taskdependent to a large extent and it seems to me that if a task relies on the inputs in a more directed 
way eg qa is often accomplished by simply copying text from inputs then the tradeoff is more significant and the proposed model is more useful whereas if a task relies on the inputs in a more indirect way eg data2text requires the generator to plan what to produce in the first place then the tradeoff is less significant and improvement made by the proposed model is less anyway such a phenomenon should not be overlooked and covered by only saying on average the other is that when evaluating the data2text generation only bleu is used however there has been a bank of work that has proved the bleu has low validity on nlg tasks eg reiter 2018 this makes at least the results on webnlg do not reliable reference reiter e 2018 a structured review of the validity of bleu computational linguistics 443 393401 i generally like the idea of this paper and the paper is perfect up to the place where the experiments are introduced discussions of this paper overlook major phenomena in the results making the discussions and probably the conclusions are in part wrong docsepthis paper proposes a simple yet effective method cocktail finetuning for the natural language generation tasks their results show that cocktail finetuning can handle both indomain data and outofdomain data effectively by combing adapterfinetuning and fullfinetuning through knowledge distillation and overall has comparable performance compared to their ensembles it also provides theoretical analysis on multiclass logistic regression to explain why it works i am not convinced and even surprised by the authors that ensembling is one of the methods that we propose as well i think the ensemble method is existing and not proposed by authors and authors only apply it to the setting lightweight finetuning and full finetuning in the experiments we present we focus on monolingual models and there isnt much evidence that prefixtuning works for translation the argument is problematic this papers title is ensembles and cocktails robust finetuning for natural language generation since it is for the general natural language generation nlg task i would like to see these results as mt is one of the most important task in nlg if the proposed method doesnt work for mt then i would think the paper overclaims its contribution strengths the idea of combining adapterfinetuning and fullfinetuning to handle both indomain data and outofdomain data is interesting and clearly motivated this paper is well written and presents their motivation clearly and finally also provides theoretical analysis on multiclass logistic regression to help understand why it can work weakness their methods only perform comparably to the ensemble method but not consistently and significantly better although cocktail finetuning only does inference once its not free and needs a knowledge distillation process before training which requires inference over wholetraining data using the adapterfinetuning model and can be timeconsuming the authors should talk about this limitation in the paper this paper aims at the natural language generation area and should consider adding experiments on datasets for machine translation mt adding results for machine translation can make this paper more convincing and solid this paper combines adapterfinetuning and fullfinetuning to handle both indomain data and outofdomain data and the idea is interesting its also well written and clearly motivated further it provides a theoretical analysis of multiclass logistic regression to explain why it works however i 
think the authors overclaimed their contribution although this papers title is ensembles and cocktails robust finetuning for natural language generation it lacks machine translation results one of the most important tasks in natural language generation in addition the authors mention that ensembling is one of the methods that we propose as well which i dont agree with i would recommend declining this paper ### Summary:
this paper presents a method for ensembling light finetuning methods and full finetuning methods to achieve better performance both indomain and outofdomain distributions as authors agree similar idea has been explored in the computer vision literature the reviewers like the overall idea of the paper but they all had some concerns regarding the experiments the reviewers provide valuable feedback on how to improve the experiments potentially running the same idea on more datasets and tasks provide more analyses and discussions on how to understand the results
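The two combination strategies described in the reviews above (linear interpolation of the two models' predictions, and a "cocktail" finetuning that distills the parameter-efficient model's in-domain predictions into the fully finetuned model) can be made concrete with a short sketch. This is only an illustration under assumptions: the models are generic classifiers producing logits, and the function names and the weights alpha and beta are invented for the example, not taken from the paper under review.

```python
# Illustrative sketch only: generic logit-level ensembling and a distillation-style
# "cocktail" objective; names and hyperparameters are hypothetical, not the paper's code.
import torch
import torch.nn.functional as F


def ensemble_predictions(full_logits: torch.Tensor,
                         light_logits: torch.Tensor,
                         alpha: float = 0.5) -> torch.Tensor:
    """Linearly interpolate the predictive distributions of the two finetuned models."""
    probs = alpha * F.softmax(full_logits, dim=-1) \
        + (1.0 - alpha) * F.softmax(light_logits, dim=-1)
    return probs.log()  # log-probabilities of the ensemble


def cocktail_loss(full_logits: torch.Tensor,
                  light_logits: torch.Tensor,
                  labels: torch.Tensor,
                  beta: float = 0.5,
                  temperature: float = 1.0) -> torch.Tensor:
    """Cross-entropy on gold labels plus distillation toward the lightweight model.

    The lightweight (adapter- or prefix-tuned) model acts as a frozen teacher on the
    in-domain training data; only the fully finetuned student receives gradients.
    """
    ce = F.cross_entropy(full_logits, labels)
    teacher = F.softmax(light_logits.detach() / temperature, dim=-1)
    student = F.log_softmax(full_logits / temperature, dim=-1)
    kd = F.kl_div(student, teacher, reduction="batchmean") * (temperature ** 2)
    return (1.0 - beta) * ce + beta * kd
```

In the setup the reviews describe, the lightweight model is the one expected to hold up better out of domain, so distilling its in-domain predictions into the fully finetuned model is meant to transfer that robustness without giving up in-domain accuracy; note, as one reviewer points out, that the teacher's predictions must be computed over the whole training set, which adds cost before or during training.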
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary the paper proposes two regularizers for finetuning pretrained mask lms to improve the robustness of nli and squad models when evaluated on adversarial nli and squad datasets adding the regularizers to regular finetuning achieves a robust accuracy comparable to adversarial training baselines adding the regularizers are added to adversarial training baselines archives extra robustness gains the first regularizer is an implementation of the information bottleneck principle specialized for contextual text representations in the general ib objective we seek to maximize the mutual information between the representation and the label as well as minimizing the mutual information between the representation and the input in this paper the authors targets the tokenlevel bert representation this design choice was not discussed in detail but i assume its motivated by the observation that our models lack of robustness is often manifested in an overreliance on local superficial features the second regularizer has a similar motivation but instead of minimizing the mutual information between the input and the representation the anchoring feature regularizer minimizes the mutual information between the global representation and the token level ones specifically those that are nonrobust and unuseful to identify nonrobust and unuseful tokens in each input the authors use a heuristic based on input gradient norm similar to how interpretability people generate heatmaps for text classification detailed comments interpretation of experimental results and choice of baselines the abstract claims that infobert achieves stateoftheart robust accuracy this is not accurate the best numbers reported in this paper are achieved by applying infobert regularizers to freelb adversarial training this can be seen as an ensemble of two or three if you count the two regularizers separately since they can be applied independent of each other adversarial training methods ensembling usually helps robustness see for example tramr etal iclr 2018 ensemble adversarial training attacks and defenses for a fair comparison freelb should be ensemble with another adversarial training method or with freelb applied to a second model when applied individually the gain from infobert has a much smaller advantage compared to the baselines missing evaluations and ablations an obvious ablation is missing apply the two regularizers separately im especially curious if both lead to gains on top of freelb the paper has a sort of disproportionate treatment of the two regularizers both theorem 31 and 32 are talking about the ib regularizer while a lot of design choices for the anchoring feature regularizer are proposed without justification or verification eg the portion of useful and robust tokens the anchoring feature regularizer relies on various heuristics definition of usefulness and robustness and if it turns out to be the main contributing factor to infobert it would be good to know if others would like to apply infobert on other tasks they might need to tune these hyperparameters formulation of the method the authors cite the localization of the ib principle in the ib regularizer as part of the novelty of the method however this kind of localization can be found in eg li eisner 2019 specializing word embeddings for parsing by information bottleneck emnlp which is one of the first applications of ib principle to nlp with 
pretrained contextualized representations in the anchoring feature regularizer the use of input gradient norm for filtering nonrobust and unuseful tokens is reminiscent of how interpretation methods generate saliency maps for text classification both are missing from the references implications and verification of theorem 32 this is a minor point but in my opinion theorem 32 is a bit of an overkill just to solidify the intuition that the performance gap becomes closer when ixi ti and ixi ti decreases it would be nice to verify empirically through experiment that this theorem is correct finally i encourage the authors to evaluate the method on more tasks and attacks and especially focus on comparing against the naive adversarial training baseline it would be good to have a better understanding of how much gain inforbert brings and what are the most important factors docsep summary the paper proposes a novel learning framework for robust against adversarial attacks finetuning of pretrained language models that is based on information theoretic arguments it introduces two regularization mechanisms and investigates their efficacy on various tasks reasons for score overall i vote for accept the approach is novel interesting and well presented the theoretical results seem to be sound it also seems to outperform competitors in the field of adversarial language models concerning the experiments some questions remain but i hope that the authors will address them in the rebuttal pros 1 the idea is interesting and well formulated the theoretical results seem to be correct to me 2 the approach is tested on several standard datasets used in adversarial language models it seems to outperform previous approaches 3 the paper is well written and clearly structured cons 1 in my view the experiments seem to show a tendency towards a slightly worse performance on the more difficult tasks in comparison to the competitor methods thus the better overall performance on the anli data could be driven by the easier tasks 2 i couldnt find a clear description of the global representation z a more explicit description would be helpful questions during rebuttal period please address and clarify the cons above minor comments 1 is definition 31 a standard definition or is it introduced by the authors 2 page 4 definition 3 contains an incomplete sentence the qx 3 page 6 evaluation metrics it should be stated witch argument is maximized 4 page 16 lemma a1 in the proof of the lemma i think that all instances of yi should be replaced with ti in formula 13 in the rightmost term the token index n should be i1 5 page 17 formula 33 hyy should probably be hty same goes for equ 36 6 squares might be missing in formulas 37 to 43 7 a reference to formula 44 would be nice docsepthis work infobert proposes additional objectives for transformer finetuning to obtain models more robust to adversarial inputs the authors first propose a mutual information based information bottleneck objective next the authors propose an adversarial loss inspired method for identifying robust features and a subsequent objective to emphasize the mutual information between global representations and these robust features the experiments demonstrate that infobert consistently outperforms other adversarial training approaches on a variety of adversarial evaluations i largely follow the logic behind the derivation however i find some of the details unclear i would like to see proofs for the theorems as well as an explanation of the assumptions under which the 
theorems hold the experimental results are convincing however there are no ablation studies to disentangle the performance contributions of the two proposed objectives for the first point the questions i have are as follows for equations 13 i find the integral notation to be a bit odd isnt it common practice to put the dydt at the very end of the integral also you should consider leaving out punctuations from equations for equation 5 why is there another 1n inside the square brackets why is equation 7 true in the general case suppose n1 is this essentially saying that any sample from an empirical distribution would provide a lower bound for the true distribution id like to see a proof for theorems 31 and 32 how do the authors define stability and robustness the manuscript talk about them in vague terms and they do not seem to be precisely defined how does equation 9 follow from 6 can you put in the intermediate steps also in this case what is n and what is m and what happened to the multiplier n from equation 6 ### Summary:
this paper introduces two regularizers that are meant to improve outofdomain robustness when used in the finetuning of pretrained transformers like bert results with anli and adversarial squad are encouraging pros new method with concrete improvements in several difficult task settings new framing of adversarial generalization cons the ablations that are highlighted in the main paper body dont do a good job of isolating the specific new contributions though the appendix provides enough detail that im satisfied that the main empirical contribution is sound reviewers found the theoretical motivation very difficult to follow in places
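For readers who want the trade-off in symbols, the generic information-bottleneck objective that the first review paraphrases can be sketched as follows. This is the textbook IB formulation, not InfoBERT's exact loss; the token-level localization and the anchoring term are summarized from the reviews' descriptions, and the symbols T, Z, beta and lambda are notation introduced here only for illustration.

```latex
% Generic information-bottleneck finetuning objective (sketch, not the paper's exact loss):
\max_{\theta}\; I(T;\,Y) \;-\; \beta\, I(X;\,T),
\qquad T = f_{\theta}(X)\ \text{(token-level representations, per the review)}.
% The anchoring-feature term described in the reviews additionally encourages the global
% representation Z to retain information about tokens judged robust/useful, i.e. something of
% the form +\lambda \sum_{i \in \mathcal{R}} I(Z;\, t_i), with the set \mathcal{R} chosen by an
% input-gradient-norm heuristic.
```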
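The input-gradient-norm heuristic mentioned in the first review can be illustrated with a short PyTorch-style sketch. This is a hypothetical illustration rather than the paper's code: `model` is assumed to be a callable mapping a tensor of token embeddings to logits, and the cutoffs that split tokens into robust and non-robust sets are deliberately left out.

```python
# Hypothetical sketch of ranking token positions by input-gradient norm, in the spirit of
# saliency maps; not the paper's implementation.
import torch

def rank_tokens_by_grad_norm(model, loss_fn, token_embeds, labels):
    """Return per-example token indices sorted by ||d loss / d embedding|| (ascending).

    Small-norm tokens are candidate 'robust' anchor features and large-norm tokens
    candidate 'non-robust' ones; how many of each to keep is a hyperparameter not shown."""
    token_embeds = token_embeds.clone().detach().requires_grad_(True)
    loss = loss_fn(model(token_embeds), labels)          # loss_fn must return a scalar
    grads, = torch.autograd.grad(loss, token_embeds)     # shape: (batch, seq_len, hidden)
    norms = grads.norm(dim=-1)                           # shape: (batch, seq_len)
    return norms.argsort(dim=-1)                         # most 'stable' token positions first
```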
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper provides a novel analysis to the linear mdp setting through an actor critic setup where the policy is softmaxparameterized the claim is that we can avoid mixing time and exploration based assumptions by using this analysis this is attractive since its much simplercleaner but also comes with a slower convergence rate the main steps include bounding the policy within a kl ball of the maximum entropy optimal policy through a mirror descent style analysis followed by controlling the critic approximation error given the policy is within the kl ball the paper is well written fairly simple to follow and provides good justifications and reminders about the overall story to the reader when going into mathematical details i was able to follow the main story but not the specific details for proving the different lemmas overall i think the motivation is strong and a cleaner analysis is definitely much appreciated on the downside there arent any algorithmic insights i could gather from the paper particularly is there anything a practitioner can take away from this this might not be a concern if the paper was offering a novel result however the question in this case is what exactly do we gain from this alternate analysis right now i dont think we learn much but hopefully the authors can comment more on this point finally below are some questions i hope the authors can reply to for me to understand the work better so the assumption that the optimal maximum entropy policy places a positive mass on every state the kl bound allows us to use the same assumption on any policy within the kl bound how should one contrast this with the assumption that the softmax parameterization itself places a positive mass on each state is that even true the kl is upper bounded by the number of actions k can you say something about how growing number of actions affects the kl bound moreover wouldnt the horizon term dominate for common cases here how tight is this kl bound overall intuitively it seems like the kl ball should be growing smaller as we explore more of the mdp whereas here it is fixed across all t can you provide an intuitive explanation for why this is still fine the paper is nicely written and provides an interesting alternative to analysing a linear mdp with an actor critic setup i have some doubts about how much insight can be gained from this analysis for future work especially algorithmically and therefore i am currently advocating for a weak accept docsepthis heavily technical theoretical paper main contribution is to show that natural actorcritic is philosophically implicitly biased toward high entropy policies more precisely the key result is to show that actorcritic when set as a batched mirror descent algorithm applied on a linear finite states and linear finite actions mdp with linear softmax policies without regularization or explicit exploration but with properly chosen parameters maintains its policies in a kl ball of radius 1 lnk 11gamma2 around the maximum entropy optimal policy the optimal policy which samples uniformly the optimal actions with high probability theorem 14 it is worth noting that this result is obtained without global mixing assumptions on the mdp only with a mixing assumption on the target policy this is an improvement from previous work khodadadian 2021 who assumed uniform mixing and provided only expectation bounds dropping these assumption is however 
costly with a step size set to 1sqrtt the number of iterations required to obtain and epsilonoptimal policy is epsilon14 the core of the paper is in the detailed proof which spans 11 pages of appendix with a very slow convergence rate 1epsilon14 the provided algorithm is strictly theoretical and is not intended to be applied the authors are clear and honest on that ground despite the real authors efforts the paper is very hard to read and requires an advanced mathematical background coupled with a vast culture on theoretical aspects of rl the main paper and the few pages of proof that i checked seemed solid but i must admit that this paper is behind my reach that said i find interesting for the community to get a few theoretical results on actorcritic and i do not expect such results to be easy to handle minor remarks page 12 line 3 is just and need not be is just and needs not to be a highly theoretical paper underlying an interesting result on the bias of actor critic toward high entropy policies the proofs are long and technical and assessing their correctness is behind my reach but my bet is that it is a good and correct paper docsepthe paper proposes and analyzes the convergence rate and optimality of a naturalactorcriticlike actor critic with linear function approximation in linear mdps the most important discovery is that once the policy is within the ball of the maximum entropy optimal policy it remains there forever provided that the critic is sufficiently accurate consequently one now needs to make ergodicity assumption for only the optimal policy instead of all policies along the optimization path i think the paper makes a reasonable contribution to the field first it explicitly uses the results from mirror descent to analyze their actor update which confirms that once the policy is within the ball of the maximum entropy optimal policy it remains there forever provided that the critic is sufficiently accurate this result appears novel to my knowledge and is useful in eliminating previously used strong assumptions second it analyzes the converge of linear td without using a projection which also appears novel to my knowledge however i do have some concerns i find the proofs are not very readerfriendly i cannot really verify the proof of lemma c2 though i tend to believe it is correct i think the coupling used in page 18 is not clearly defined what is the distribution of zi j and can ei j be clearly defined and i cannot get why the inequality leading to 32m b holds this is the major concern for my score 5 id like to raise my score if this concern is clarified in general i think there are many skipped steps in the proofs it would be good if all the proofs can be verified directly by looking at the pdf without doing extra computation a possible improvement i am not sure if linear mdp is a necessary ingredient for this work in my understanding its used only to ensure that the td fixed point is the true value function i think this can also be ensured by either considering tabular setting or linear function approximation with compatible features it would be good if the authors can briefly discuss this after all linear mdp is a very strong assumption i think the paper makes reasonable contribution but the presentation of the proofs can be improved the authors successfully addressed my concerns and i increased my score accordingly ### Summary:
the paper makes a significant contribution in the rather sparse and challenging field of convergence analyses of actorcritic style algorithms under the linear mdp structural assumption showing that there is a natural bias towards being highentropy as one of the reviewers points out although it is unlikely that the strategy actually proposed is amenable to implementation the paper nevertheless provides a clean and novel analysis of convergence of learning by eschewing the usual mixing time type assumptions often found in the theoreticallyoriented rl literature based on this strength of the paper i am glad to recommend its acceptance
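To make the mirror-descent picture in this review and summary slightly more concrete, the update and the kind of invariant being claimed can be sketched as below. The update is the standard softmax natural-policy-gradient (multiplicative-weights) step; the KL radius is only an indicative reading of the review's description, and the exact constants, KL direction and conditions are those of the paper, not reproduced here.

```latex
% Softmax policies with a mirror-descent / natural policy gradient style update:
\pi_{t+1}(a \mid s) \;\propto\; \pi_t(a \mid s)\,\exp\!\big(\eta_t\, \widehat{Q}_t(s,a)\big),
\qquad \eta_t \approx 1/\sqrt{t} \ \text{(the step size quoted in the review)}.
% Indicative form of the claimed invariant: with a sufficiently accurate critic the iterates
% remain in a KL ball around the maximum-entropy optimal policy \pi^{\ast}_{\mathrm{ME}},
% roughly  \mathrm{KL}\big(\pi^{\ast}_{\mathrm{ME}}(\cdot \mid s)\,\big\|\,\pi_t(\cdot \mid s)\big)
%          \lesssim \tfrac{1 + \ln K}{(1-\gamma)^{2}},
% where K is the number of actions.
```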
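A toy numerical version of that update, with made-up Q-values in place of a learned critic, is sketched below. It only demonstrates the multiplicative-weights mechanics and the 1/sqrt(t) step size mentioned in the review; it is not the analyzed algorithm.

```python
# Toy illustration of the softmax mirror-descent / NPG update; not the paper's algorithm.
import numpy as np

def npg_softmax_step(pi, Q, eta):
    """One multiplicative-weights update pi <- pi * exp(eta * Q), renormalized per state.
    pi and Q are (num_states, num_actions) arrays."""
    new_pi = pi * np.exp(eta * Q)
    return new_pi / new_pi.sum(axis=1, keepdims=True)

pi = np.full((2, 3), 1.0 / 3.0)            # uniform initial policy: 2 states, 3 actions
Q = np.array([[1.0, 0.0, 0.0],             # made-up critic values, not policy evaluation
              [0.0, 0.5, 1.0]])
for t in range(1, 51):
    pi = npg_softmax_step(pi, Q, eta=1.0 / np.sqrt(t))
print(pi.round(3))                         # mass concentrates on argmax actions yet stays positive
```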
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
the paper used the graph scattering network as the encoder and an mlp as the decoder to generate links, graph signals, and graphs.

pros:
1. clearly written, easy to follow.
2. no need to train the encoder.
3. good results on link prediction tasks.

cons:
1. lack of novelty: it is a simple combination of existing encoders and decoders. for example, compared to vgae the only difference in the link prediction task is using a different encoder. even if the performance is very good, it can only demonstrate the effectiveness of others' encoder work and this paper's correct selection of a good encoder.
2. lack of insights: as a combination of existing works, if the paper could deeply explain why this encoder is effective for generation, that would also be beneficial, but we do not see this part. in particular, in the graph generation task the more important component may be the decoder that regulates the validness of the generated graphs, eg "constrained generation of semantically valid graphs via regularizing variational autoencoders" (nips 2018), which used a similar decoder but added strong regularizations in the vae.
3. results on qm9 are not good enough, and there is a lack of references: some recent works, eg "junction tree variational autoencoder for molecular graph generation" (icml 2018), could already achieve 100% valid.

docsep

summary: the authors apply the wavelet scattering transform to construct an autoencoder for graphs. they apply this architecture to reconstructing citation graphs, images, and generating molecules.

assessment: it was difficult to discern what parts of this paper were new work. the graph scattering transform seems to have appeared first in hammond et al or zhou and lerman, and the proposed decoder in 3.2.1 is attributed to kipf and welling. the molecule generation portion was interesting, but i don't think there was enough novel content in this paper to justify acceptance to iclr. i could be convinced otherwise if the authors' contribution is clarified in rebuttal.

questions and concerns:
- i found the definition of spf (page 5) a little confusing; in particular, what constitutes a path p in this setting?
- can you motivate the whitening operation a that is applied to the encoding? it seems like this is eliminating a lot of the information encoded in bar x.
- i'm confused by the choice of loss function at the top of page 6. since d(z) = sigma(...), it seems like d(i, j) is meant to represent the probability of a link between i and j; in that case the loss is a sum of negative probabilities, which is unusual. was this meant to be a sum of log probabilities? also, this loss doesn't seem to account for including edges where there are none; can you explain why this is the case?
- in section 4.2 the encoded dimension is 256 iiuc. considering that the data was restricted to the boots class, the reduction from 784 to 256 dimensions does not seem significant. the authors concede that some high-frequency information is lost, so i wonder how their approach compares to eg a low-pass filter or a simple compression algorithm.
- section 4.3 states that the molecules considered are constructed from the atoms c, h, o, n, and f; later there are multiple references to only 4 atom types (one-hot vectors in r4, etc). clarify please.

docsep

summary: the paper presents a generative model for graphs, which is a vae-like architecture where the encoder is a scattering transform with fixed parameters rather than a trainable neural net.

pros:
- the problem of graph generative models is very important and a lot of the existing methods are not very scalable.
- using spectral methods in place of standard neural net operations makes a lot of sense.
- numerical results for the link prediction task seem to be significantly better than those of baselines.

cons:
- the paper contains various imprecisions (see the non-exhaustive list below) and a significant amount of statements which are hard to understand.
- i am not sure if the work can be considered particularly novel; in particular, it is not really emphasised what the difference is with angles and mallat 2018.
- the motivation for the work is not entirely clear. it is true that gans and vaes have their issues, but in my view it is not really explained or argued why the proposed method would solve them.
- i find the argument about the efficiency not very convincing, especially after looking at the numbers at the bottom of p7: the scattering transform alone takes several orders of magnitude longer than the baseline. the authors also mention that their method does not require training of the encoder, but i do not see any comparisons with respect to the number of parameters.
- the experimental evaluation for signal generation and graph generation is not very convincing. for the former there is no real comparison to existing models, and for the latter the experimental setup seems a bit strange: it appears that the models were trained on different subsets of the dataset, making the comparison not very meaningful. also, i would expect to see the same methods compared across all the tasks, unless it is impossible for some reason.

various typos / imprecisions / unclear statements:
- p1 "are complex as well as difficult to train and finetune": not at all clear what this means.
- p1 "their development is based on fruitful methods of deep learning in the euclidean domain such as convolutional and recurrent neural networks": recurrent and convolutional neural networks are not necessarily restricted to euclidean domains.
- p1 "using a prescribed graph representation it is possible to avoid training the two components at the same time but the quality of the prescribed representation is important in order to generate promising results": not clear what this sentence means.
- p2 "unlike gan or vae the model in this paper does not require training two components either iteratively or at the same time": i do not see why that would necessarily be a bad thing, especially in the case of vae where traditional training in practice corresponds to training a single neural net.
- p3 "gan-type graph networks use a discriminator in order to compete with the generator and make its training more powerful": i am not sure this statement is strictly correct.
- p9 "we remark that the number of molecules in the training sets are not identical to that in": does this mean that the models are effectively trained on different data? in that case the comparison is not very meaningful.

### Summary:
ar1 is concerned about the novelty and what the exact novel elements of the proposed approach are. ar2 is worried about the novelty (a combination of existing blocks) and the lack of insights. ar3 is also concerned about the novelty, complexity, and poor evaluations / lack of thorough comparisons with other baselines. after rebuttal the reviewers remained unconvinced; eg ar3 still would like to see why the proposed method would be any better than gan-based approaches. with regret, at this point the ac cannot accept this paper, but the ac encourages the authors to take all reviews into consideration and improve their manuscript accordingly: matters such as complexity (perhaps scattering networks aren't the most friendly here), clear insights, and strong comparisons to generative approaches are needed.
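a side note on the loss question raised in the second review above: if d(i, j) is a predicted link probability, the usual objective is a sum of log-probabilities (binary cross-entropy) over both observed edges and sampled non-edges. the sketch below is a generic illustration under that assumption, with an inner-product decoder and illustrative variable names; it is not the paper's actual loss.

```python
import numpy as np

# generic link-prediction objective, assuming d(i, j) = sigmoid(z_i . z_j)
# is a link probability; names and shapes are illustrative only.
def link_loss(Z, pos_edges, neg_edges, eps=1e-9):
    """binary cross-entropy over observed edges and sampled non-edges."""
    def prob(i, j):
        return 1.0 / (1.0 + np.exp(-Z[i] @ Z[j]))
    loss = 0.0
    for i, j in pos_edges:                  # log-probability of true links
        loss -= np.log(prob(i, j) + eps)
    for i, j in neg_edges:                  # log-probability of absent links
        loss -= np.log(1.0 - prob(i, j) + eps)
    return loss / max(1, len(pos_edges) + len(neg_edges))

# example usage with a random latent matrix Z of shape (n_nodes, dim):
# Z = np.random.randn(5, 8); link_loss(Z, [(0, 1), (1, 2)], [(0, 3)])
```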
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
this paper presented a dialog response generation method using an adversarial learning framework. the generator is based on the previously proposed hierarchical recurrent encoder-decoder network (hred) and the discriminator is a bidirectional rnn; noise samples are introduced in the generator for response generation. they evaluated their approach on two datasets and showed mostly better results than the other systems.

the novelty of the paper is limited. modeling longer dialog history beyond the current turn is not new; this has been used in different tasks such as dialog act classification, intent classification and slot filling, response generation, etc. the generator is based on the previous hred. adding noise to generate responses is somewhat new, but that doesn't seem to be well motivated or justified: why adding gaussian noise improves the diversity or informativeness of the responses is not explained. the idea of a discriminator has been widely used recently for language-generation-related tasks. what is new here: is it the word-based metric, sharing the context and word information with the generator? it would be helpful if the authors can clarify their contribution.

regarding using mle to first generate multiple hypotheses in the generator: how is the quality of the n-best responses? is there a way to measure the goodness of the responses in some kind of reranking framework, not necessarily a discriminator?

the results in the table showed the proposed method outperforms the others in terms of those objective metrics. i feel some subjective evaluations are needed to strengthen the paper; from the sample responses in the table it doesn't look like the new method generates very good responses.

detailed comments:
- sec 2, before 2.1, last paragraph: "with the gan objective we can match the noise distribution p(z_i) to the distribution of the ground truth response p(x_{i+1} | x_i)" needs clarification.
- figure 1 caption: left and right are misplaced.
- sec 2.1, last paragraph: "without z_i the net could still learn a mapping from x_i to y_i but would produce deterministic outputs". i think the authors mean that the system generates a probability distribution p(y | x) and the output is the most likely one from that. however, if the output is a distribution, the system can also do some sampling and not necessarily output the top one. this is not that different from adding noise in the history: if that's based on some distribution then it may still be deterministic.

docsep

the paper applies conditional gan to the hred model (serban et al 2016) for dialogue response generation, showing improvements in terms of informativeness and diversity compared to hred and vhred (serban et al 2017). the paper is technically solid and relatively easy to follow, and the results are good, but comparisons with previous work (descriptive and experimental) are rather weak.

related work is incomplete. the paper specifically argues for the use of gan to improve diversity in dialogue response generation, but this is not the first paper to do so: xu et al 2017 presents a gan-like setup that targets exactly the same goal, but that work is not cited in the paper. same for zhang et al 2018; the latter work is rather recent, but it still should probably be cited.

evaluation: there is no evaluation against xu et al, which targets the same goal. the authors didn't even compare their methods against baselines used in other gan works for diverse response generation, eg mmi (xu et al, zhang et al) or li et al's gan approach (xu et al), which makes it difficult to draw comparisons between these related methods. as opposed to these other works, the paper doesn't offer any human evaluation. it would have been nice to include an lstm or gru baseline, as these models are still often used in practice and the vhred paper suggests (serban et al 2016, table 1) that lstm holds up quite well against hred if we extrapolate the results of vhred vs lstm and vhred vs hred. the ablation of gan and hred would help us understand which of the two is more important.

in sum, the work is relatively solid, but considering how much has already been done on generating diverse responses, including 3 other papers also using gan, i don't think this paper is too influential. its main weakness is the evaluation, particularly the lack of human evaluation.

minor comments:
- introduction: "diversity promoting training objective but their model is for single turn conversations". if these were single turns they wouldn't really be called conversations; that objective has been used with 3-turn conversations. it can actually be applied to multiturn dialogue, as with any autoregressive generative model; for example, it has been exploited that way as a baseline for multiturn dialogue (li et al 2016, deep reinforcement learning for dialogue generation). note it is not a training objective but only an objective function at inference time, which is a more valid reason to criticize that paper.
- "we use greedy decoding mle on the first part of the objective": doesn't that hurt diversity because of mle? what about using sampling instead, maybe with temperature?
- algorithm 1: the p_theta_g terms don't seem to match the text of section 2.
- h_i is sometimes written in bold and sometimes not; see also eq 12 for comparison.
- end of section 2.1: there are multiple li et al references, specify which one.
- end of sections 2.2 and 2.4: extra closing parenthesis after n0.
- figures are too small to read the subscripts.

references:
- xu et al 2017: zhen xu, bingquan liu, baoxun wang, sun chengjie, xiaolong wang, zhuoran wang and chao qi. neural response generation via gan with an approximate embedding layer. emnlp 2017.
- zhang et al 2018: yizhe zhang, michel galley, jianfeng gao, zhe gan, xiujun li, chris brockett, bill dolan. 2018. generating informative and diverse conversational responses via adversarial information maximization.

docsep

this paper presents an adversarial learning model for generating diverse responses for dialogue systems, based on hred (hierarchical recurrent encoder-decoder network). the contribution of the work mainly lies in 1) adversarial learning and 2) injecting gaussian noise at word level and sentence level to encourage diversity. overall the idea is interesting, and the automatic evaluation based on perplexity, bleu, rouge, etc shows that the proposed methods outperform existing methods.

several suggestions:
- it seems like injecting noise at word level almost always outperforms adding sentence-level noise. it would be better if the authors can explain why this happens and whether it can be applied to other response generation tasks.
- building on the above comment, the authors can also experiment with other response generation datasets, eg interactions on social media.
- from the examples in tables 3 and 4, the generated responses are of low quality overall. i suggest the authors run a human evaluation to see whether there is any significant difference among system responses by different models, on aspects of informativeness and fluency at least.

docsep

this paper proposes a new approach to dialogue modeling by introducing two innovations over an established dialogue model, the hred (hierarchical recurrent encoder-decoder network). the innovations are 1) adding a gan objective to the standard mle objective of the hred model and 2) modifying the hred model to include an attention mechanism over the local conditioning information, ie the call before the present response.

writing: the writing was mostly ok, though there were some issues. early in section 2 the authors rather awkwardly transition from a mathematical formalism that included the two halves of the dialogue as x (call) and y (response) to a formalism that only considers a single sequence x.

novelty and impact: the proposed approach explicitly combines an established model with two components that are themselves well-established, so it's fair to say that the novelty is relatively weak. the model development is sensible but reasonably straightforward; it isn't clear to me that a careful reader of the literature in this area, particularly the gan-for-text literature, will learn that much from this paper.

experiments: overall the empirical evaluation shows fairly convincingly that the proposed model is effective. i do wonder why the hred-gan model would outperform the hred model on perplexity: the hred model is directly optimizing mle, which is directly related to the perplexity measure, while the hred-gan includes an additional objective that should perhaps sacrifice likelihood. this puzzling result was not discussed and really should be. the generated responses given in table 3, while showing some improvement over hred and vhred (esp. in terms of response length and specificity), do not fit the context particularly well; this really just shows we still have some way to go before this challenging task is solved. it would be useful if the authors could run an ablation study to help resolve the relative contributions of the two innovations (gan and attention) to the improvements in results; perhaps the improvement in perplexity discussed above is due to the use of attention.

detailed comments / questions:
- in the paragraph between eqns 2 and 3 the authors seem to suggest that teacher forcing is an added heuristic; however, this is just the correct evaluation of the mle objective.
- in discussing the combined mle-gan objective in eqn 8, does the mle objective use teacher forcing? some earlier text (discussed above) leads me to suspect that it does not.

### Summary:
this paper proposes an adversarial learning framework for dialogue generation. the generator is based on the previously proposed hierarchical recurrent encoder-decoder network (hred) by serban et al, and the discriminator is a bidirectional rnn; noise is introduced in the generator for response generation. the approach is evaluated on two commonly used corpora, movie data and the ubuntu corpus. in the original version of the paper human evaluation was missing, an issue raised by all reviewers; however, this has been added in the revisions, and these results supplement the previous automated measures in demonstrating the benefits and significant gains from the proposed approach. all reviewers raise the issue of the work being incremental and not novel enough given the previous work on hred/vhred and the use of hierarchical approaches to model dialogue context. furthermore, noise generation seems new but is not well motivated, justified, and analyzed.
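to make the combined objective discussed by the reviewers concrete, the sketch below shows a generic mle-plus-adversarial loss of the form l = l_mle + lambda * l_adv, where the mle term is the teacher-forced negative log-likelihood of the ground-truth response (with word-level gaussian noise assumed to be injected inside the decoder) and the adversarial term uses a discriminator score on a free-running sample. the mixing weight lambda, the non-saturating form of the adversarial term, and the function interface are all assumptions for illustration, not the paper's implementation.

```python
import numpy as np

# generic combined objective l = l_mle + lam * l_adv; all names are
# illustrative assumptions, not the paper's code.
def combined_objective(token_logprobs, disc_score_on_sample, lam=1.0, eps=1e-9):
    # token_logprobs: log p(y_t | y_<t, context, z_t) for each ground-truth word,
    # evaluated with teacher forcing; word-level gaussian noise z_t is assumed
    # to have been injected inside the decoder when computing these terms.
    l_mle = -np.sum(token_logprobs)
    # non-saturating generator term: -log d(context, generated response),
    # where the discriminator score is computed on a free-running sample.
    l_adv = -np.log(disc_score_on_sample + eps)
    return l_mle + lam * l_adv

# example with dummy numbers:
# combined_objective(np.log([0.4, 0.2, 0.7]), 0.35)
```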
275, 253, 38549, 841, 8499, 253, 2045, 16644, 5593, 275, 17227, 253, 5373, 285, 1534, 15988, 432, 253, 4081, 2746, 50276, 455, 30628, 7164, 253, 2523, 273, 253, 789, 1146, 32809, 285, 417, 4460, 2217, 1677, 253, 2045, 789, 275, 288, 433, 42781, 433, 285, 897, 273, 24498, 7274, 281, 1566, 17414, 3634, 33810, 6046, 5978, 3133, 747, 533, 310, 417, 973, 17194, 17285, 285, 5867, 50276 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper proposes an imagebased learning framework for object rearrangement in particular three major challenges are addressed 1 objects are closely located with each other 2 the planner can determine which objects are subject to rearrangement 3 the planner can resolve a situation when other objects occupy the goal locations of target objects the proposed method is evaluated on simulations and realworld experiments and shows its promising performance in terms of success rate accuracy and efficiency strengths the proposed framework is novel and it is convincing why this learning approach should work the results from both simulation and experiments look promising weaknesses details are included in the summary of recommendation and issues sections contributions seem less significant as the justification of the fundamental limitations of the previous methods is not well addressed object sequencing has some critical limitations which may fail in practice in environments containing many objects docsepthis work presents a novel method for tabletop object rearrangement amid clutter for cases where a subset of the objects may need to be removed from the scene it works through sequencing the objects to manipulate through a measure of the graph edit distance from the current state to the goal state where each object is manipulated by either a learned grasp or a learned push primitive both implemented as fully convolutional qvalue maps and where each grasped object is either moved away from the scene or placed in a location with the greatest feature correspondence to the goal scene extensive evaluation is done in simulation and it is demonstrated that the learned policies can successfully transfer to a real robot strengths clearly written and largely easy to follow the problem addressed is important to the field of robotics the proposed solution is novel in several respects in particular the scenegraph edit based object sequencing and the depthimage cross correlation aspect of the featurebased repositioning the experiments are detailed and convincingly demonstrate the approach works well and the video clearly demonstrates the approach can be adapted to work on a real robot the discussion of limitations is thorough weaknesses several of the figures 1 3 4 have images andor text that are so small that even zooming in does not make it easy to make out the details in the pdf and that would be hard to understand in a printed version i felt some of the text not covering the main novel ideas or results could have been cut or moved to allow for more results or analysis to be included in the main paper instead of the appendix in particular the stanard rl notation preliminaries are covered in section 31 and the summary of the qvalue map method which as stated has a lot of overlap with 20 to have been moved to the appendix or cut down i feel including the planning step and ablation results in the main text would have been more informative while the citation of the relevant prior work is well done there is very little summary of it i feel that for readers less familiar with the cited works would hard a hard time understanding which are the key novel contributions of this work and which parts are largely similar to prior work such as the qvalue map while the ablation results for the action sequencing is informative i was left wondering how well a simple handcoded heuristic such as moving each nontarget 
object to the bin first and then repositioning each object by qvalue might perform in general it is not intuitive why the first set of actions would not be just removing all nontarget objects from the scene followed by planning to reposition the remaining objects to the target the goal specification including an image of the final scene needed for template matching seems a bit limiting the method would be more powerful and widely applicable were it able to work with just a scenegraph based of the goal scene in some parts the paper can be a bit dense with extremely long paragraphs and it seemed that it could have been edited a bit more to make reading easier docsepselective object rearrangement in clutter presented an imagebased learned method for tabletop object rearrangement in clutter using parallel jaw gripper the method consists of three stages including a graphbased object sequencing to select the next object to manipulate using the current scene graph and goal scene graphs graph edit distance feature based action selection to map image to robot actions select between pushing or grasping and a correspondencebased placement policy overall authors claim that the method is the first to address concurrently object selection from initially cluttered scene discarding the unrelated objects and dealing with cases where goal location is occupied experiments show that the method is able to outperform sota methods in less restrictive settings while also providing reasonable performance in more restrictive settings strengths real robot experiments the authors demonstrated zeroshot transfer from simulation to real robot cool interesting choice of actions push grasp where grasp is preferred push designed to singulate objects in clutter experiments are well designed including definition of a clutter coefficient quantitative results of experiments analyzed and convincing justifications given for observed behavior outperforms existing sota methods weaknesses predefined assumptions predefined window crop size for visually defining grasped object would be better to see some ablations on generalization to outofdistribution objects would the assumption still hold dependence on pretrained methods uoisnet3d is used to provide set of object segmentation masks how does the system react to errors in the predicted segmentation mask missing experiment it appears the nerp baseline has adequate performance when only swap is handled it would be interesting to see how the presented method works when only swap is handled as well can the authors comment on whether this is an online method and what kind of computational resources is required to run the pipeline docsepthis paper presents a system for imagebased object rearrangement in cluttered tabletop environments where given a goal image the robot plans a sequence of grasp and pushing actions to achieve the configuration corresponding to the goal image the presented approach has 3 components 1 an object sequencing method which constructs a scene graph and encourages the robot to manipulate objects which result in a graph closer in topology to the scene graph corresponding to the goal image 2 an action selection method which selects a grasping or pushing action on sequenced objects based on their likelihood of success and 3 a placement module which places objects such that they match their pose in the goal image the action selection method and placement module are very similar to ideas from prior work but the object sequencing method appears to be relatively novel the 
constructed system is also impressive the videos demonstrate the efficacy of the approach on robot hardware strengths 1 the presented method is clearly described and for the most part all components of the system are easy to understand 2 the resulting system is deployed on physical robotic hardware and videos suggest that the proposed method is able to successfully rearrange configurations with up to 15 objects this is a very challenging task and the effort put into deploying this system in hardware experiments is much appreciated 3 the graph based object sequencing method is very interesting and novel to the best of my knowledge i appreciate the ablation in section 42 which appears to demonstrate the utility of the approach as opposed to heuristics based only on grasp qvalues such as in mech search by danielczuk et al 4 i appreciate the extensive evaluation of the proposed method on a number of different numbers of initial objects and goal objects in simulation weaknesses 1 the novelty of the work with respect to prior work needs to be made a bit more clear the paper claims that this is the first paper that allows the robot to selectively rearrange objects from image observations in clutter while the authors claim that prior tamp approaches that incorporate learningbased vision models 14 15 16 17 struggle to scale to adversarial environments the reason for this should be expanded in more depth and preferably the authors should experimentally show that prior methods which seem in principle capable of addressing the same problem setting are intractable for the problem settings considered in this work 2 the comparison to prior work in simulation experiments does not appear calibrated the authors should implement prior methods in the same setup as their algorithm is evaluated in otherwise the numbers in table 2 do not seem to be a fair comparison and it is hard to draw any meaningful conclusions about the relative performance of the presented method it seems important to implement at least 2 of 14 15 16 17 24 25 based on which the authors believe will be the strongest baselines in their experimental setup 3 the discussion of the physical experiment results is insufficient in the main body of the paper it is important that the paper discusses quantitative results such as the number of trials success rate etc of these experiments for the reader to be able to evaluate the significance of the results it is also preferable to compare to prior methods in physical experiments but it is ok to omit this if sufficient comparison is at least provided in simulation ### Summary:
the paper proposes a framework for object rearrangement in clutter given a goal image using graphbased object sequencing action selection pushing or grasping and a placement module the reviews are mixed and the current rating of the paper is weak reject strong accept weak accept weak reject reviewers agree that the object sequencing method is interesting and that the real robot videos are impressive and compelling however reviewers also raise several valid questions and concerns including experiments do not appear to be calibrated comparisons to prior work should be performed in the same experimental setup for an informative comparison clarification on novelty claims why the proposed algorithm is better positioned to handle clutter and swap authors are encouraged to respond to reviewers comments and questions updates reviewers appreciate the authors detailed responses and agree that the presented ideas are interesting with compelling real robot results on cluttered tabletop rearrangements that involve both translation and rotation the idea of distinguishing pick and push for rearrangement also adds value to the community the experiments and ablations in simulation provide helpful quantitative evidence to the efficacy of the approach there are still a few outstanding concerns on whether the comparisons to prior methods are perfectly calibrated the authors have clarified that to address this they have tried to match the experimental setup from prior work as closely as possible including same data input format data generation process and evaluation metrics given that the prior methods did not release code for the authors to run on their system these experiments should suffice effectively demonstrating the same task but with clear notable improvements
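The sequencing step discussed in the reviews above ranks objects by how much manipulating them closes the gap between the current scene graph and the goal scene graph, measured with graph edit distance. The sketch below illustrates that idea with networkx; the toy graphs, the category attribute, and the remove-and-rescore rule are assumptions made for illustration only, not the reviewed paper's implementation.

    # illustrative sketch of graph-edit-distance-based object sequencing (assumed details)
    import networkx as nx

    def scene_graph(objects, relations):
        # objects: (name, category) pairs; relations: (a, b, predicate) triples
        g = nx.Graph()
        for name, category in objects:
            g.add_node(name, category=category)
        for a, b, predicate in relations:
            g.add_edge(a, b, predicate=predicate)
        return g

    current = scene_graph([("mug", "mug"), ("box", "box"), ("can", "can")],
                          [("mug", "box", "next_to"), ("box", "can", "next_to")])
    goal = scene_graph([("mug", "mug"), ("box", "box")],
                       [("mug", "box", "next_to")])

    same_category = lambda a, b: a["category"] == b["category"]
    baseline = nx.graph_edit_distance(current, goal, node_match=same_category)

    # score each object by how much removing it (a stand-in for manipulating it)
    # reduces the edit distance to the goal graph; a higher score means manipulate earlier
    for obj in list(current.nodes):
        candidate = current.copy()
        candidate.remove_node(obj)
        score = baseline - nx.graph_edit_distance(candidate, goal, node_match=same_category)
        print(obj, score)

A greedy loop like this is only one possible way to turn the distance into an ordering; the reviews do not spell out the exact criterion used by the paper.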
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 29328, 271, 2460, 3169, 4715, 7792, 323, 1789, 47410, 275, 1798, 1264, 2201, 7881, 403, 9713, 337, 5113, 403, 8244, 4441, 342, 1016, 643, 374, 253, 499, 9582, 476, 3653, 534, 5113, 403, 2256, 281, 47410, 495, 253, 499, 9582, 476, 11322, 247, 4112, 672, 643, 5113, 26263, 253, 4736, 8593, 273, 2303, 5113, 253, 4081, 1332, 310, 6760, 327, 9938, 285, 1524, 10186, 4679, 285, 2722, 697, 12532, 3045, 275, 2426, 273, 2323, 2281, 7200, 285, 6733, 20544, 50276, 783, 4081, 7792, 310, 4460, 285, 352, 310, 21414, 2139, 436, 4715, 2746, 943, 789, 50276, 783, 1543, 432, 1097, 9864, 285, 4679, 1007, 12532, 50276, 20881, 1255, 265, 4278, 403, 2908, 275, 253, 6010, 273, 17401, 285, 3374, 7118, 50276, 1987, 8303, 1646, 1679, 1534, 347, 253, 22861, 273, 253, 7936, 7364, 273, 253, 2045, 3082, 310, 417, 973, 9713, 50276, 6082, 12184, 556, 690, 4619, 7364, 534, 778, 1891, 275, 3946, 275, 12620, 4508, 1142, 5113, 5474, 33032, 2520, 789, 10262, 247, 4460, 1332, 323, 2829, 3956, 1789, 47410, 17421, 502, 12216, 323, 2219, 835, 247, 8578, 273, 253, 5113, 778, 878, 281, 320, 5176, 432, 253, 6200, 352, 2987, 949, 12184, 253, 5113, 281, 26526, 949, 247, 2557, 273, 253, 4216, 12921, 4181, 432, 253, 1655, 1375, 281, 253, 4736, 1375, 835, 1016, 1789, 310, 32494, 407, 2057, 247, 6311, 15909, 390, 247, 6311, 7450, 20523, 1097, 9009, 347, 4751, 27311, 267, 2805, 2877, 8115, 285, 835, 1016, 46137, 1789, 310, 2057, 4395, 1977, 432, 253, 6200, 390, 4845, 275, 247, 4328, 342, 253, 6459, 4735, 17668, 281, 253, 4736, 6200, 9470, 7103, 310, 2218, 275, 9864, 285, 352, 310, 5183, 326, 253, 6311, 7823, 476, 8379, 3700, 281, 247, 1524, 15688, 20544, 50276, 49346, 3542, 285, 8127, 3477, 281, 956, 50276, 783, 1895, 9713, 310, 1774, 281, 253, 1673, 273, 15688, 982, 50276, 783, 4081, 2900, 310, 4460, 275, 2067, 23006, 275, 1798, 253, 5362, 909, 1354, 12921, 1754, 1789, 12184, 285, 253, 6864, 5695, 2831, 5921, 4809, 273, 253, 4735, 3169, 294, 3321, 272, 50275, 783, 4679, 403, 7000, 285, 2410, 1763, 5356, 7568, 253, 2746, 2987, 973, 285, 253, 3492, 4518, 14371, 253, 2746, 476, 320, 12956, 281, 789, 327, 247, 1524, 15688, 50276, 783, 5955, 273, 7364, 310, 11080, 50275, 20881, 1255, 265, 50276, 43249, 273, 253, 8442, 337, 495, 577, 452, 3888, 285, 263, 2505, 326, 403, 594, 1355, 326, 1014, 21282, 272, 275, 1057, 417, 1056, 352, 3477, 281, 1056, 562, 253, 4278, 275, 253, 31697, 285, 326, 651, 320, 1892, 281, 2096, 275, 247, 11462, 2715, 50276, 74, 3543, 690, 273, 253, 2505, 417, 10985, 253, 2022, 4460, 5697, 390, 1543, 812, 452, 644, 2624, 390, 4395, 281, 1581, 323, 625, 1543, 390, 1783, 281, 320, 2908, 275, 253, 2022, 2929, 3185, 273, 253, 30762, 275, 1798, 253, 331, 266, 472, 391, 77, 14951, 11944, 249, 3927, 403, 6107, 275, 2593, 4562, 285, 253, 6010, 273, 253, 2805, 2877, 3711, 1332, 534, 347, 4767, 556, 247, 2257, 273, 14787, 342, 1384, 281, 452, 644, 4395, 281, 253, 30762, 390, 2624, 1066, 891, 1928, 1690, 253, 7219, 3213, 285, 28913, 1543, 275, 253, 2022, 2505, 651, 452, 644, 625, 27096, 50276, 6050, 253, 25577, 273, 253, 4623, 2720, 789, 310, 973, 2218, 627, 310, 1077, 1652, 6010, 273, 352, 891, 1928, 326, 323, 10668, 1679, 7615, 342, 253, 11106, 2987, 651, 1892, 247, 1892, 673, 4685, 534, 403, 253, 2234, 4460, 9021, 273, 436, 789, 285, 534, 4243, 403, 8127, 2074, 281, 2720, 789, 824, 347, 253, 2805, 2877, 3711, 50275, 6050, 253, 28913, 1543, 323, 253, 2250, 12184, 310, 27096, 891, 
369, 1669, 12371, 849, 973, 247, 2969, 1133, 38059, 47641, 824, 347, 4886, 1016, 25450, 1816, 1789, 281, 253, 10269, 806, 285, 840, 294, 3321, 272, 1016, 1789, 407, 2805, 2877, 1537, 1347, 275, 2087, 352, 310, 417, 27350, 2139, 253, 806, 873, 273, 5231, 651, 417, 320, 816, 11922, 512, 25450, 1816, 5113, 432, 253, 6200, 3560, 407, 7219, 281, 294, 3321, 253, 5780, 5113, 281, 253, 2303, 50276, 783, 4736, 17776, 1690, 271, 2460, 273, 253, 2457, 6200, 3058, 323, 7646, 11038, 3133, 247, 2372, 14155, 50276, 783, 1332, 651, 320, 625, 6422, 285, 7561, 7763, 497, 352, 2104, 281, 789, 342, 816, 247, 5362, 909, 1354, 1754, 273, 253, 4736, 6200, 50275, 249, 690, 4243, 253, 2929, 476, 320, 247, 2372, 14086, 342, 6685, 1048, 33295, 285, 352, 4455, 326, 352, 812, 452, 644, 16168, 247, 2372, 625, 281, 1056, 4361, 6927, 5474, 33032, 41677, 1789, 47410, 275, 502, 12216, 3559, 271, 2460, 3169, 6311, 1332, 323, 2829, 3956, 1789, 47410, 275, 502, 12216, 970, 7529, 21065, 18234, 3803, 253, 1332, 8414, 273, 1264, 8661, 1690, 247, 4216, 3169, 1789, 12184, 281, 3609, 253, 1735, 1789, 281, 26526, 970, 253, 1655, 6200, 4216, 285, 4736, 6200, 14580, 4216, 12921, 4181, 4735, 1754, 2250, 5438, 281, 3711, 2460, 281, 15688, 5231, 3609, 875, 13383, 390, 48635, 285, 247, 17668, 3169, 14663, 3646, 4583, 4477, 1750, 326, 253, 1332, 310, 253, 806, 281, 2953, 35046, 1789, 5438, 432, 8523, 26986, 3606, 6200, 1262, 13218, 253, 20804, 5113, 285, 10620, 342, 2219, 835, 4736, 4328, 310, 13598, 4679, 921, 326, 253, 1332, 310, 2104, 281, 562, 32231, 256, 5503, 3082, 275, 1679, 29190, 7533, 1223, 671, 5277, 5272, 3045, 275, 625, 29190, 7533, 50276, 296, 3755, 20556, 50276, 6549, 15688, 4679, 253, 4477, 5183, 1182, 254, 6934, 302, 3700, 432, 9864, 281, 1524, 15688, 4484, 50276, 47606, 4327, 273, 5231, 7450, 50276, 737, 4938, 835, 15909, 310, 9013, 7450, 4158, 281, 1625, 4187, 5113, 275, 502, 12216, 50276, 16217, 3825, 403, 973, 4158, 1690, 5426, 273, 247, 502, 12216, 10235, 11745, 1543, 273, 4679, 5867, 285, 21414, 816, 6787, 1677, 323, 2540, 3879, 50276, 483, 468, 13015, 5368, 256, 5503, 3082, 50276, 20881, 1255, 265, 50276, 12787, 37224, 13260, 41364, 3497, 17177, 1979, 323, 25910, 13947, 46137, 1789, 651, 320, 1805, 281, 923, 690, 490, 77, 569, 327, 26647, 281, 562, 1171, 35360, 5113, 651, 253, 9376, 1335, 2186, 50276, 39606, 327, 3215, 11273, 3082, 1484, 10225, 3024, 20, 69, 310, 908, 281, 2085, 873, 273, 1789, 26405, 25965, 849, 1057, 253, 985, 8071, 281, 6332, 275, 253, 8131, 26405, 8989, 50276, 33722, 3368, 352, 4620, 253, 38998, 81, 8245, 556, 10599, 3045, 672, 760, 22101, 310, 15726, 352, 651, 320, 4722, 281, 923, 849, 253, 3559, 1332, 2987, 672, 760, 22101, 310, 15726, 347, 973, 50276, 5092, 253, 4477, 4385, 327, 1880, 436, 310, 271, 3909, 1332, 285, 752, 2238, 273, 15180, 5300, 310, 2424, 281, 1408, 253, 15722, 5474, 33032, 2520, 2929, 10262, 247, 985, 323, 2460, 3169, 1789, 47410, 275, 26986, 3606, 2829, 3956, 12620, 835, 1677, 247, 4736, 2460, 253, 15688, 5827, 247, 3425, 273, 15909, 285, 13383, 5231, 281, 5115, 253, 6661, 3969, 281, 253, 4736, 2460, 253, 3559, 2746, 556, 495, 4295, 337, 271, 1789, 12184, 1332, 534, 21031, 247, 6200, 4216, 285, 29426, 253, 15688, 281, 26526, 5113, 534, 50276, 6870, 275, 247, 4216, 8003, 275, 18080, 281, 253, 6200, 4216, 3969, 281, 253, 4736, 2460, 374, 271, 2250, 5438, 1332, 534, 34899, 247, 48635, 390, 13383, 2250, 327, 27867, 5113, 1754, 327, 616, 12177, 273, 2323, 285, 495, 247, 14663, 6333, 534, 5053, 5113, 824, 326, 597, 3761, 616, 16753, 275, 253, 4736, 2460, 253, 2250, 5438, 1332, 
285, 14663, 6333, 403, 1077, 2074, 281, 5697, 432, 2720, 789, 533, 253, 1789, 12184, 1332, 4620, 281, 320, 4942, 4460, 253, 8818, 985, 310, 671, 13943, 253, 10556, 7568, 253, 10307, 273, 253, 2746, 327, 15688, 10309, 20544, 50276, 18, 253, 3559, 1332, 310, 4518, 2529, 285, 323, 253, 954, 629, 512, 4295, 273, 253, 985, 403, 3477, 281, 2096, 374, 253, 4795, 985, 310, 18329, 327, 3520, 35121, 10309, 285, 10556, 1804, 326, 253, 4081, 1332, 310, 2104, 281, 8379, 23690, 912, 16012, 342, 598, 281, 1458, 5113, 436, 310, 247, 1077, 11132, 4836, 285, 253, 3434, 1691, 715, 45021, 436, 985, 275, 10309, 4679, 310, 1199, 14109, 495, 253, 4216, 1754, 1789, 12184, 1332, 310, 1077, 4722, 285, 4460, 281, 253, 1682, 273, 619, 3640, 891, 11435, 253, 28913, 275, 2593, 5976, 534, 4620, 281, 7568, 253, 11839, 273, 253, 2746, 347, 10066, 281, 344, 321, 3397, 1754, 760, 327, 15909, 2805, 8858, 824, 347, 275, 479, 348, 3186, 407, 16447, 928, 14617, 2788, 1162, 355, 577, 891, 11435, 253, 9470, 7103, 273, 253, 4081, 1332, 327, 247, 1180, 273, 1027, 3904, 273, 3302, 5113, 285, 4736, 5113, 275, 9864, 50276, 20881, 1255, 265, 50276, 18, 253, 38135, 273, 253, 789, 342, 1675, 281, 2720, 789, 3198, 281, 320, 1160, 247, 2372, 625, 2590, 253, 2929, 3916, 326, 436, 310, 253, 806, 2929, 326, 4483, 253, 15688, 281, 21656, 23690, 912, 5113, 432, 2460, 7313, 275, 502, 12216, 1223, 253, 4477, 1750, 326, 2720, 50149, 7274, 326, 19071, 4715, 3169, 8113, 3210, 1638, 1458, 1668, 1722, 11182, 281, 4311, 281, 48960, 12620, 253, 1921, 323, 436, 943, 320, 11848, 275, 625, 6864, 285, 13027, 253, 4477, 943, 21657, 921, 326, 2720, 3082, 534, 1646, 275, 8063, 7032, 273, 15974, 253, 1072, 1895, 4758, 403, 540, 44374, 323, 253, 1895, 7533, 2783, 275, 436, 789, 50276, 19, 253, 5301, 281, 2720, 789, 275, 9864, 4679, 1057, 417, 3176, 35890, 253, 4477, 943, 3359, 2720, 3082, 275, 253, 1072, 9978, 347, 616, 5933, 310, 6760, 275, 5010, 253, 3904, 275, 2829, 374, 513, 417, 1646, 281, 320, 247, 4344, 5301, 285, 352, 310, 1892, 281, 3812, 667, 14282, 11815, 670, 253, 4103, 3045, 273, 253, 3559, 1332, 352, 3133, 1774, 281, 3359, 387, 1878, 374, 273, 1638, 1458, 1668, 1722, 2164, 2030, 1754, 327, 534, 253, 4477, 2868, 588, 320, 253, 19508, 1666, 25379, 275, 616, 5661, 9978, 50275, 20, 253, 5955, 273, 253, 3520, 3368, 1543, 310, 12497, 275, 253, 2022, 2133, 273, 253, 2929, 352, 310, 1774, 253, 2929, 25339, 2677, 1483, 1543, 824, 347, 253, 1180, 273, 7587, 2323, 2281, 3966, 273, 841, 4679, 323, 253, 9414, 281, 320, 2104, 281, 7472, 253, 8453, 273, 253, 1543, 352, 310, 671, 29224, 281, 7277, 281, 2720, 3082, 275, 3520, 4679, 533, 352, 310, 8718, 281, 35991, 436, 604, 4209, 5301, 310, 387, 1878, 2530, 275, 9864, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 247, 7792, 323, 1789, 47410, 275, 502, 12216, 1677, 247, 4736, 2460, 970, 4216, 3169, 1789, 12184, 2250, 5438, 13383, 390, 48635, 285, 247, 14663, 6333, 253, 10123, 403, 6804, 285, 253, 1655, 13716, 273, 253, 2929, 310, 5075, 12009, 2266, 2997, 5075, 2997, 5075, 12009, 30628, 5194, 326, 253, 1789, 12184, 1332, 310, 4722, 285, 326, 253, 1524, 15688, 10556, 403, 13943, 285, 18511, 2299, 30628, 671, 7164, 2067, 3588, 3533, 285, 7350, 1690, 50275, 16217, 3825, 513, 417, 3176, 281, 320, 35890, 14023, 281, 2720, 789, 943, 320, 2684, 275, 253, 1072, 5661, 9978, 323, 271, 27096, 5301, 50276, 498, 274, 1877, 327, 38135, 3916, 2139, 253, 4081, 5933, 310, 1805, 15471, 281, 6016, 502, 12216, 285, 22101, 50276, 43355, 403, 14659, 281, 3794, 281, 30628, 5701, 285, 3533, 50276, 484, 24275, 50276, 15337, 398, 11435, 253, 
4477, 7000, 6128, 285, 5194, 326, 253, 3559, 5697, 403, 4722, 342, 18511, 1524, 15688, 1543, 327, 26986, 3606, 2829, 3956, 33289, 3658, 326, 6388, 1097, 10234, 285, 9381, 253, 2934, 273, 32495, 2619, 285, 7450, 323, 47410, 671, 11323, 1318, 281, 253, 3114, 253, 4679, 285, 490, 77, 569, 275, 9864, 2085, 9371, 11745, 1941, 281, 253, 10307, 273, 253, 2746, 627, 403, 1335, 247, 1643, 16383, 7350, 327, 1880, 253, 14023, 281, 2720, 3082, 403, 9670, 35890, 50276, 783, 4477, 452, 31637, 326, 281, 2953, 436, 597, 452, 3597, 281, 3761, 253, 5661, 9978, 432, 2720, 789, 347, 8244, 347, 1896, 1690, 1072, 941, 3280, 5981, 941, 5978, 1232, 285, 7103, 17082, 1677, 326, 253, 2720, 3082, 858, 417, 3727, 2127, 323, 253, 4477, 281, 1408, 327, 616, 985, 841, 4679, 943, 36433, 8069, 17227, 253, 1072, 4836, 533, 342, 2590, 16613, 11701 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 29328, 271, 2460, 3169, 4715, 7792, 323, 1789, 47410, 275, 1798, 1264, 2201, 7881, 403, 9713, 337, 5113, 403, 8244, 4441, 342, 1016, 643, 374, 253, 499, 9582, 476, 3653, 534, 5113, 403, 2256, 281, 47410, 495, 253, 499, 9582, 476, 11322, 247, 4112, 672, 643, 5113, 26263, 253, 4736, 8593, 273, 2303, 5113, 253, 4081, 1332, 310, 6760, 327, 9938, 285, 1524, 10186, 4679, 285, 2722, 697, 12532, 3045, 275, 2426, 273, 2323, 2281, 7200, 285, 6733, 20544, 50276, 783, 4081, 7792, 310, 4460, 285, 352, 310, 21414, 2139, 436, 4715, 2746, 943, 789, 50276, 783, 1543, 432, 1097, 9864, 285, 4679, 1007, 12532, 50276, 20881, 1255, 265, 4278, 403, 2908, 275, 253, 6010, 273, 17401, 285, 3374, 7118, 50276, 1987, 8303, 1646, 1679, 1534, 347, 253, 22861, 273, 253, 7936, 7364, 273, 253, 2045, 3082, 310, 417, 973, 9713, 50276, 6082, 12184, 556, 690, 4619, 7364, 534, 778, 1891, 275, 3946, 275, 12620, 4508, 1142, 5113, 5474, 33032, 2520, 789, 10262, 247, 4460, 1332, 323, 2829, 3956, 1789, 47410, 17421, 502, 12216, 323, 2219, 835, 247, 8578, 273, 253, 5113, 778, 878, 281, 320, 5176, 432, 253, 6200, 352, 2987, 949, 12184, 253, 5113, 281, 26526, 949, 247, 2557, 273, 253, 4216, 12921, 4181, 432, 253, 1655, 1375, 281, 253, 4736, 1375, 835, 1016, 1789, 310, 32494, 407, 2057, 247, 6311, 15909, 390, 247, 6311, 7450, 20523, 1097, 9009, 347, 4751, 27311, 267, 2805, 2877, 8115, 285, 835, 1016, 46137, 1789, 310, 2057, 4395, 1977, 432, 253, 6200, 390, 4845, 275, 247, 4328, 342, 253, 6459, 4735, 17668, 281, 253, 4736, 6200, 9470, 7103, 310, 2218, 275, 9864, 285, 352, 310, 5183, 326, 253, 6311, 7823, 476, 8379, 3700, 281, 247, 1524, 15688, 20544, 50276, 49346, 3542, 285, 8127, 3477, 281, 956, 50276, 783, 1895, 9713, 310, 1774, 281, 253, 1673, 273, 15688, 982, 50276, 783, 4081, 2900, 310, 4460, 275, 2067, 23006, 275, 1798, 253, 5362, 909, 1354, 12921, 1754, 1789, 12184, 285, 253, 6864, 5695, 2831, 5921, 4809, 273, 253, 4735, 3169, 294, 3321, 272, 50275, 783, 4679, 403, 7000, 285, 2410, 1763, 5356, 7568, 253, 2746, 2987, 973, 285, 253, 3492, 4518, 14371, 253, 2746, 476, 320, 12956, 281, 789, 327, 247, 1524, 15688, 50276, 783, 5955, 273, 7364, 310, 11080, 50275, 20881, 1255, 265, 50276, 43249, 273, 253, 8442, 337, 495, 577, 452, 3888, 285, 263, 2505, 326, 403, 594, 1355, 326, 1014, 21282, 272, 275, 1057, 417, 1056, 352, 3477, 281, 1056, 562, 253, 4278, 275, 253, 31697, 285, 326, 651, 320, 1892, 281, 2096, 275, 247, 11462, 2715, 50276, 74, 3543, 690, 273, 253, 2505, 417, 10985, 253, 2022, 4460, 5697, 390, 1543, 812, 452, 644, 2624, 390, 4395, 281, 1581, 323, 625, 1543, 390, 1783, 281, 320, 2908, 275, 253, 2022, 2929, 3185, 273, 253, 30762, 275, 1798, 253, 331, 266, 472, 391, 77, 14951, 11944, 249, 3927, 403, 6107, 275, 2593, 4562, 285, 253, 6010, 273, 253, 2805, 2877, 3711, 1332, 534, 347, 4767, 556, 247, 2257, 273, 14787, 342, 1384, 281, 452, 644, 4395, 281, 253, 30762, 390, 2624, 1066, 891, 1928, 1690, 253, 7219, 3213, 285, 28913, 1543, 275, 253, 2022, 2505, 651, 452, 644, 625, 27096, 50276, 6050, 253, 25577, 273, 253, 4623, 2720, 789, 310, 973, 2218, 627, 310, 1077, 1652, 6010, 273, 352, 891, 1928, 326, 323, 10668, 1679, 7615, 342, 253, 11106, 2987, 651, 1892, 247, 1892, 673, 4685, 534, 403, 253, 2234, 4460, 9021, 273, 436, 789, 285, 534, 4243, 403, 8127, 2074, 281, 2720, 789, 824, 347, 253, 2805, 2877, 3711, 50275, 6050, 253, 28913, 1543, 323, 253, 2250, 12184, 310, 27096, 891, 
369, 1669, 12371, 849, 973, 247, 2969, 1133, 38059, 47641, 824, 347, 4886, 1016, 25450, 1816, 1789, 281, 253, 10269, 806, 285, 840, 294, 3321, 272, 1016, 1789, 407, 2805, 2877, 1537, 1347, 275, 2087, 352, 310, 417, 27350, 2139, 253, 806, 873, 273, 5231, 651, 417, 320, 816, 11922, 512, 25450, 1816, 5113, 432, 253, 6200, 3560, 407, 7219, 281, 294, 3321, 253, 5780, 5113, 281, 253, 2303, 50276, 783, 4736, 17776, 1690, 271, 2460, 273, 253, 2457, 6200, 3058, 323, 7646, 11038, 3133, 247, 2372, 14155, 50276, 783, 1332, 651, 320, 625, 6422, 285, 7561, 7763, 497, 352, 2104, 281, 789, 342, 816, 247, 5362, 909, 1354, 1754, 273, 253, 4736, 6200, 50275, 249, 690, 4243, 253, 2929, 476, 320, 247, 2372, 14086, 342, 6685, 1048, 33295, 285, 352, 4455, 326, 352, 812, 452, 644, 16168, 247, 2372, 625, 281, 1056, 4361, 6927, 5474, 33032, 41677, 1789, 47410, 275, 502, 12216, 3559, 271, 2460, 3169, 6311, 1332, 323, 2829, 3956, 1789, 47410, 275, 502, 12216, 970, 7529, 21065, 18234, 3803, 253, 1332, 8414, 273, 1264, 8661, 1690, 247, 4216, 3169, 1789, 12184, 281, 3609, 253, 1735, 1789, 281, 26526, 970, 253, 1655, 6200, 4216, 285, 4736, 6200, 14580, 4216, 12921, 4181, 4735, 1754, 2250, 5438, 281, 3711, 2460, 281, 15688, 5231, 3609, 875, 13383, 390, 48635, 285, 247, 17668, 3169, 14663, 3646, 4583, 4477, 1750, 326, 253, 1332, 310, 253, 806, 281, 2953, 35046, 1789, 5438, 432, 8523, 26986, 3606, 6200, 1262, 13218, 253, 20804, 5113, 285, 10620, 342, 2219, 835, 4736, 4328, 310, 13598, 4679, 921, 326, 253, 1332, 310, 2104, 281, 562, 32231, 256, 5503, 3082, 275, 1679, 29190, 7533, 1223, 671, 5277, 5272, 3045, 275, 625, 29190, 7533, 50276, 296, 3755, 20556, 50276, 6549, 15688, 4679, 253, 4477, 5183, 1182, 254, 6934, 302, 3700, 432, 9864, 281, 1524, 15688, 4484, 50276, 47606, 4327, 273, 5231, 7450, 50276, 737, 4938, 835, 15909, 310, 9013, 7450, 4158, 281, 1625, 4187, 5113, 275, 502, 12216, 50276, 16217, 3825, 403, 973, 4158, 1690, 5426, 273, 247, 502, 12216, 10235, 11745, 1543, 273, 4679, 5867, 285, 21414, 816, 6787, 1677, 323, 2540, 3879, 50276, 483, 468, 13015, 5368, 256, 5503, 3082, 50276, 20881, 1255, 265, 50276, 12787, 37224, 13260, 41364, 3497, 17177, 1979, 323, 25910, 13947, 46137, 1789, 651, 320, 1805, 281, 923, 690, 490, 77, 569, 327, 26647, 281, 562, 1171, 35360, 5113, 651, 253, 9376, 1335, 2186, 50276, 39606, 327, 3215, 11273, 3082, 1484, 10225, 3024, 20, 69, 310, 908, 281, 2085, 873, 273, 1789, 26405, 25965, 849, 1057, 253, 985, 8071, 281, 6332, 275, 253, 8131, 26405, 8989, 50276, 33722, 3368, 352, 4620, 253, 38998, 81, 8245, 556, 10599, 3045, 672, 760, 22101, 310, 15726, 352, 651, 320, 4722, 281, 923, 849, 253, 3559, 1332, 2987, 672, 760, 22101, 310, 15726, 347, 973, 50276, 5092, 253, 4477, 4385, 327, 1880, 436, 310, 271, 3909, 1332, 285, 752, 2238, 273, 15180, 5300, 310, 2424, 281, 1408, 253, 15722, 5474, 33032, 2520, 2929, 10262, 247, 985, 323, 2460, 3169, 1789, 47410, 275, 26986, 3606, 2829, 3956, 12620, 835, 1677, 247, 4736, 2460, 253, 15688, 5827, 247, 3425, 273, 15909, 285, 13383, 5231, 281, 5115, 253, 6661, 3969, 281, 253, 4736, 2460, 253, 3559, 2746, 556, 495, 4295, 337, 271, 1789, 12184, 1332, 534, 21031, 247, 6200, 4216, 285, 29426, 253, 15688, 281, 26526, 5113, 534, 50276, 6870, 275, 247, 4216, 8003, 275, 18080, 281, 253, 6200, 4216, 3969, 281, 253, 4736, 2460, 374, 271, 2250, 5438, 1332, 534, 34899, 247, 48635, 390, 13383, 2250, 327, 27867, 5113, 1754, 327, 616, 12177, 273, 2323, 285, 495, 247, 14663, 6333, 534, 5053, 5113, 824, 326, 597, 3761, 616, 16753, 275, 253, 4736, 2460, 253, 2250, 5438, 1332, 
285, 14663, 6333, 403, 1077, 2074, 281, 5697, 432, 2720, 789, 533, 253, 1789, 12184, 1332, 4620, 281, 320, 4942, 4460, 253, 8818, 985, 310, 671, 13943, 253, 10556, 7568, 253, 10307, 273, 253, 2746, 327, 15688, 10309, 20544, 50276, 18, 253, 3559, 1332, 310, 4518, 2529, 285, 323, 253, 954, 629, 512, 4295, 273, 253, 985, 403, 3477, 281, 2096, 374, 253, 4795, 985, 310, 18329, 327, 3520, 35121, 10309, 285, 10556, 1804, 326, 253, 4081, 1332, 310, 2104, 281, 8379, 23690, 912, 16012, 342, 598, 281, 1458, 5113, 436, 310, 247, 1077, 11132, 4836, 285, 253, 3434, 1691, 715, 45021, 436, 985, 275, 10309, 4679, 310, 1199, 14109, 495, 253, 4216, 1754, 1789, 12184, 1332, 310, 1077, 4722, 285, 4460, 281, 253, 1682, 273, 619, 3640, 891, 11435, 253, 28913, 275, 2593, 5976, 534, 4620, 281, 7568, 253, 11839, 273, 253, 2746, 347, 10066, 281, 344, 321, 3397, 1754, 760, 327, 15909, 2805, 8858, 824, 347, 275, 479, 348, 3186, 407, 16447, 928, 14617, 2788, 1162, 355, 577, 891, 11435, 253, 9470, 7103, 273, 253, 4081, 1332, 327, 247, 1180, 273, 1027, 3904, 273, 3302, 5113, 285, 4736, 5113, 275, 9864, 50276, 20881, 1255, 265, 50276, 18, 253, 38135, 273, 253, 789, 342, 1675, 281, 2720, 789, 3198, 281, 320, 1160, 247, 2372, 625, 2590, 253, 2929, 3916, 326, 436, 310, 253, 806, 2929, 326, 4483, 253, 15688, 281, 21656, 23690, 912, 5113, 432, 2460, 7313, 275, 502, 12216, 1223, 253, 4477, 1750, 326, 2720, 50149, 7274, 326, 19071, 4715, 3169, 8113, 3210, 1638, 1458, 1668, 1722, 11182, 281, 4311, 281, 48960, 12620, 253, 1921, 323, 436, 943, 320, 11848, 275, 625, 6864, 285, 13027, 253, 4477, 943, 21657, 921, 326, 2720, 3082, 534, 1646, 275, 8063, 7032, 273, 15974, 253, 1072, 1895, 4758, 403, 540, 44374, 323, 253, 1895, 7533, 2783, 275, 436, 789, 50276, 19, 253, 5301, 281, 2720, 789, 275, 9864, 4679, 1057, 417, 3176, 35890, 253, 4477, 943, 3359, 2720, 3082, 275, 253, 1072, 9978, 347, 616, 5933, 310, 6760, 275, 5010, 253, 3904, 275, 2829, 374, 513, 417, 1646, 281, 320, 247, 4344, 5301, 285, 352, 310, 1892, 281, 3812, 667, 14282, 11815, 670, 253, 4103, 3045, 273, 253, 3559, 1332, 352, 3133, 1774, 281, 3359, 387, 1878, 374, 273, 1638, 1458, 1668, 1722, 2164, 2030, 1754, 327, 534, 253, 4477, 2868, 588, 320, 253, 19508, 1666, 25379, 275, 616, 5661, 9978, 50275, 20, 253, 5955, 273, 253, 3520, 3368, 1543, 310, 12497, 275, 253, 2022, 2133, 273, 253, 2929, 352, 310, 1774, 253, 2929, 25339, 2677, 1483, 1543, 824, 347, 253, 1180, 273, 7587, 2323, 2281, 3966, 273, 841, 4679, 323, 253, 9414, 281, 320, 2104, 281, 7472, 253, 8453, 273, 253, 1543, 352, 310, 671, 29224, 281, 7277, 281, 2720, 3082, 275, 3520, 4679, 533, 352, 310, 8718, 281, 35991, 436, 604, 4209, 5301, 310, 387, 1878, 2530, 275, 9864, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 247, 7792, 323, 1789, 47410, 275, 502, 12216, 1677, 247, 4736, 2460, 970, 4216, 3169, 1789, 12184, 2250, 5438, 13383, 390, 48635, 285, 247, 14663, 6333, 253, 10123, 403, 6804, 285, 253, 1655, 13716, 273, 253, 2929, 310, 5075, 12009, 2266, 2997, 5075, 2997, 5075, 12009, 30628, 5194, 326, 253, 1789, 12184, 1332, 310, 4722, 285, 326, 253, 1524, 15688, 10556, 403, 13943, 285, 18511, 2299, 30628, 671, 7164, 2067, 3588, 3533, 285, 7350, 1690, 50275, 16217, 3825, 513, 417, 3176, 281, 320, 35890, 14023, 281, 2720, 789, 943, 320, 2684, 275, 253, 1072, 5661, 9978, 323, 271, 27096, 5301, 50276, 498, 274, 1877, 327, 38135, 3916, 2139, 253, 4081, 5933, 310, 1805, 15471, 281, 6016, 502, 12216, 285, 22101, 50276, 43355, 403, 14659, 281, 3794, 281, 30628, 5701, 285, 3533, 50276, 484, 24275, 50276, 15337, 398, 11435, 253, 
4477, 7000, 6128, 285, 5194, 326, 253, 3559, 5697, 403, 4722, 342, 18511, 1524, 15688, 1543, 327, 26986, 3606, 2829, 3956, 33289, 3658, 326, 6388, 1097, 10234, 285, 9381, 253, 2934, 273, 32495, 2619, 285, 7450, 323, 47410, 671, 11323, 1318, 281, 253, 3114, 253, 4679, 285, 490, 77, 569, 275, 9864, 2085, 9371, 11745, 1941, 281, 253, 10307, 273, 253, 2746, 627, 403, 1335, 247, 1643, 16383, 7350, 327, 1880, 253, 14023, 281, 2720, 3082, 403, 9670, 35890, 50276, 783, 4477, 452, 31637, 326, 281, 2953, 436, 597, 452, 3597, 281, 3761, 253, 5661, 9978, 432, 2720, 789, 347, 8244, 347, 1896, 1690, 1072, 941, 3280, 5981, 941, 5978, 1232, 285, 7103, 17082, 1677, 326, 253, 2720, 3082, 858, 417, 3727, 2127, 323, 253, 4477, 281, 1408, 327, 616, 985, 841, 4679, 943, 36433, 8069, 17227, 253, 1072, 4836, 533, 342, 2590, 16613, 11701 ]
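The numeric blocks above are the tokenized form of the same prompt-plus-review text: a sequence of token ids, an all-ones attention mask, and a labels sequence that repeats the token ids, which is the usual causal-language-modeling layout. A minimal sketch for turning such a row back into text follows; the tokenizer checkpoint is an assumption, since the dump itself does not name one, so the decoded string is only expected to match if that assumption holds.

    # minimal sketch for inspecting a row like the one above; the tokenizer name is an
    # assumption, not something stated in the dump
    from transformers import AutoTokenizer

    row = {
        "input_ids": [30003, 310, 1677, 2278, 273, 247, 2561, 2929],  # first ids of the row above
        "attention_mask": [1] * 8,                                    # all ones = no padding
        "labels": [30003, 310, 1677, 2278, 273, 247, 2561, 2929],     # mirrors input_ids
    }

    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed vocabulary
    # should print the start of the prompt text if the assumed vocabulary matches this data
    print(tokenizer.decode(row["input_ids"]))

    # labels identical to input_ids is the standard causal-LM setup: the model learns to
    # predict every token of the prompt-plus-summary from its prefix
    assert row["labels"] == row["input_ids"]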
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper builds a novel indoor perception dataset on radiance fields motivated by the recent advances of nerf overall this is an interesting and novel contribution since it exhibits a notable memory compression rate from the original dataset and also provides a unified representation of 3d and 2d information the paper is clearly written and wellorganized 1 the first perception dataset built on radiance fields 2 significant memory reduction for the storage 3 unified representation for 2d and 3d input the title looks quite broad because the paper only considers indoor scenarios and benchmarks three tasks the authors are suggested to tone down their claims docsepthis work explores the potential of using explicit radiance fields as a unified compressed format for 2d image sequences and 3d scenes it presents two datasets of pretrained plenoxels of co3d objects and scannet indoor scenes which are used for benchmark perception tasks such as 2d3d classification and 3d semantic segmentation through either rendering 2d images or directly using its feature grids it hopes to inspire future work to build larger scale 3d datasets and train bigger vision models largescale training of plenoxels on co3d and scannet effectively compressing the video sequences into plenoxels format comprehensive study of perception tasks on those pretrained plenoxels shows reasonably good results background augmentation and camera manipulation can be enabled with neural rendering which are nontrivial for traditional 2d3d data format for nerflike models to be a real candidate to compress data data format we probably need more study on the ratedistortioncompute tradeoff which is lacking in this work eg the compression scheme quantization is quite vanilla the paper claims implicit representation dataset but plenoxels are more like explicit representation please make it more clear in abstract and paper we can think of perception on trained plenoxels as twostep processes first compress images into plenoxels then use the compressed features for perception tasks one can argue the compression step may lose important information for downstream perception tasks thus endtoend perception models should still achieve better accuracy docsepthis work introduces the recent implicit 3d scene representation method for constructing datasets this new representation can unify 2d images and 3d scene datasets into one compact format specifically the authors evaluate this idea by building two datasets upon the co3d and scannet datasets then the authors tested the accuracy with multiple resnet variants to show lower memory and the availability of the proposed datasets the first work to use implicit representation for a largescale dataset the compression rate for the scannet dataset is impressive in the proposed dataset more data augmentation can be performed eg render novel views substitute the background i like the papers motivation and i believe there is some potential for an implicit representation of the dataset however i have some concerns about the effectiveness of the proposed method for practical usage though the data representation is unified it seems it cannot help to simplify the training process or boost performance especially for 2d tasks the images should be rendered first or generated during the training process the rendering may also slow down the training process which the authors do not report there is no such step if
using the original dataset besides the rendering time the reconstruction error is also introduced for the 3d task this problem is alleviated by learning from the implicit representation directly but it will present another problem the network architecture of the existing method should be redesigned to adapt to this data format data compression is the core advantage of the implicit representation but the result on the objectcentric dataset is not good enough as the image quality will decrease i wonder if it is worth this tradeoff with background augmentation the accuracy result test on the original test set is still lower than trained with the original dataset as shown in fig3 with the drawback of implicit representation it seems hard to represent synthetic datasets like 3dfuture3dfront well specifically the scene editing eg material lighting and domain randomization technique are not supported also the advantage of generating accurate ground truth eg noiseless depth is hard to preserve some minor concerns resnet is used for all experiments experimenting with some sota methods in the classification and semantic segmentation task is better fig3 needs more discussion it is desirable that the result trained on perfception has a lower accuracy than the original dataset showing a domain gap caused by reconstruction error however when evaluating using the test set of perfception the result ie shallow green and blue that trained on perfception expected to have a lower domain gap with the test set is still lower than the result on the original test set ie dark green and blue please discuss the choice of plenoxels needs more discussion the author says it has explicit geometry and consistent features but other methods like ingp can also obtain geometry and be trained faster docsepthis paper proposed a dataset called perfception for perception tasks such as classification and segmentation the data is generated by a recent method called plenoxels this method allows a high memory compression rate while still maintaining 2d and 3d information 1 table 1 shows a clear comparison with existing datasets 2 the proposed dataset can be used for 2d image classification 3d object classification and 3d scene semantic segmentation 1 this paper creates an impression that the paper is good at data compression i know this is not the intended purpose of this paper but the paper reads like this 2 the authors provided google drive links for urls people in some countries cannot access services provided by google and hence cannot access this dataset docsepthis paper proposes the first largescale neural radiance based dataset which is generated by the plenoxels 14 authors study visual perception tasks that directly process the implicit representation in their dataset the pipeline for generating implicit representations is readytouse and provided for new data generation this paper uses the radiance field generated by a nerfbased method plenoxels 14 as a novel data structure to replace traditional 2d or 3d data structure it is an interesting attempt since constructing a complete 3d structure of a scene is laborious directly generating implicit representation from input images for 3d perception will be really useful two datasets perfceptionco3d and perfceptionscannet are introduced the implicit representation gives another choice for 3d perception the experiment shows that previous 2d3d methods can directly apply on the novel data structure and the performance is good the dataset generation is totally based on the method plenoxels 14 all data
is directly stored from the intermediate data generated by the plenoxels eg spherical harmonic coefficients densities the contribution of this paper is not substantial enough the density and color information are also generated by other nerfbased methods 5 6 10 11 the 3d structure like points can be generated by pointnerf httpsarxivorgabs220108845 the reason why we have to use the implicit representation of plenoxels is not given a thorough comparison between perfception and implicit data generated by methods mentioned before is necessary the statement in l249 perfceptionco3d contains the same information as the original co3d dataset can not be summarized from the results fig 3 resnet trained on perfceptionco3d has obvious performance drop compared with resnet trained on the co3d when the test set is co3d besides resnet trained on perfceptionco3d also has performance drop when test set changes from co3d to perfceptionco3d i think there exists information loss after transforming original data to implicit one a discussion about this is necessary a performance comparison between methods trained on scannet and perfceptionscannet is necessary similar to fig 3 docsepthe paper focuses on the datasets in implicit 3d representation this work creates the first largescale implicit representation datasets called perfception for perception tasks including classification and semantic segmentation in nerf whats more the paper provides a readytouse pipeline to generate the implicit datasets automatically the paper introduces the first largescale implicit datasets that can be readily used in downstream perception tasks the dataset called perfception is both a timely and relevant contribution to the broader research community additionally it shows a significant memory compression rate from the original dataset such a compression rate is beneficial for efficient research the third strength of the paper is the pipeline to generate the implicit datasets automatically which motivates to generate a very large scale 3d dataset in future 1 minimal discussion of the ethical or societal implications 2 lack the comparisons of rendering qualities between perfception dataset and current popular 3d datasets in nerf research ### Summary:
authors propose a novel largescale implicit representation dataset for perception tasks called the perfception dataset which consists of two parts that incorporate both objectcentric and scenecentric scans for classification and segmentation the dataset obtains a significant memory compression rate while containing both 2d and 3d information in a unified form some important pros from reviews the first largescale perception dataset built on radiance fields implicit representation as another choice for 3d perception significant and impressive storage reduction compression comprehensive comparison with existing datasets authors provided a strong rebuttal and reviewers agreed that it addressed their concerns i believe that the dataset is a relevant contribution to the broader research community hence i recommend accepting the paper authors should make sure to incorporate suggestions from reviews into their final manuscript
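Several of the reviews above weigh the storage of a plenoxel-style grid, a density value plus spherical-harmonic color coefficients per occupied voxel, against the raw image sequence it replaces. The sketch below makes that comparison concrete; the occupancy, precision, frame count and resolution are invented placeholders rather than perfception's actual numbers, and degree-2 harmonics (27 color coefficients per voxel) is an assumption about the representation.

    # back-of-the-envelope storage comparison for a plenoxel-style grid vs. raw frames;
    # every concrete number here is an assumed placeholder, not a figure from the dataset
    def plenoxel_bytes(occupied_voxels, sh_degree=2, channels=3, bytes_per_value=2):
        sh_coeffs = (sh_degree + 1) ** 2 * channels   # 27 coefficients for degree-2 harmonics
        values_per_voxel = 1 + sh_coeffs              # density plus color coefficients
        return occupied_voxels * values_per_voxel * bytes_per_value

    def frame_bytes(n_frames, height, width, channels=3):
        return n_frames * height * width * channels   # uncompressed 8-bit rgb

    grid = plenoxel_bytes(occupied_voxels=2_000_000)            # assumed sparse occupancy
    video = frame_bytes(n_frames=1500, height=968, width=1296)  # assumed scan length and resolution
    print(f"grid: {grid / 1e6:.0f} MB, frames: {video / 1e9:.1f} GB, "
          f"ratio: {video / grid:.0f}x under these assumptions")

Whether a ratio like this survives once the frames are actually compressed is exactly the rate-distortion-compute question one of the reviews raises.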
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes and studies the model completion problem given a trained network and the data on which it was trained if a subset of the network is reinitialized from scratch how many retraining iterations are needed to achieve the original network accuracy or some percentage of it for a variety of networks and problems in both supervised and reinforcement learning model completion mc hardness is quantified for individual network layers/sections the experiments are the core of the paper and are generally well documented and seem reproducible however there are two issues that cloud the paper 1 the problem motivation bounding the security of model splitting is a bit odd has model splitting been proposed in the literature as a potential solution to shared model governance otherwise it feels like the problem setting was invented to justify the analysis in this paper the tail wagging the dog as the saying goes 2 model completion may yet still be an interesting analytical tool for deep networks but this requires a different evaluation for instance model completion provides a way to study how complicated different network layers are to learn or maybe to quantify how much of the inference task may be contained in each though these concepts would need precise language and experimental evidence but how do these observations compare to other ways of obtaining similar observations for instance from the pruning literature molchanov 2017 iclr httpsopenreviewnetpdfidsjgciw5gl includes several figures detailing the statistics of individual network layers and how prunable the filters in each are this is largely an analytical paper and ill readily acknowledge that it is difficult to pull a clear and insightful study out of a jumble of experimental observations and hard to review such a paper too but the limitations of the problem motivation point 1 and in my opinion the misaligned focus of the analysis point 2 hurt the clarity and significance of this paper for it to really be a useful tool in understanding deep learning some additional work seems to be needed other notes 3 pruning literature would be a reasonable comparison in the related work for instance han iclr 2017 httpsarxivorgabs160704381 describes a densesparsedense method where a dense model is pruned sparse after which the pruned connections are reinitialized and retrained dense leading to improved accuracy relative to the original dense model 4 consider replacing the uncommonly used ca with eg 1000x instead of ca 1000x 5 the specifics about imagenet in the intro to section 3 should be moved to section 4 6 in section 32 paragraph 2 clarify if loss refers to test loss as stated in the intro to section 3 7 in figure 2 alpha09 and figure 3 alpha10 bottom why are the values constant docsepthis paper proposes the interesting idea of analyzing how difficult it is to reinitialize and retrain layers in neural networks they study these techniques in the context of imagenet classification and reinforcement learning in the atari and deepmind lab domains while these are interesting ideas and domains to study i have concerns with the positioning and execution of the paper positioning execution and motivation on the positioning of the paper a significant part of the introduction and related work section is spent arguing that this approach can be used for shared model governance in contexts where homomorphic encryption or secure multiparty computation
would instead be used comparing the approaches studied in this paper to these sophisticated cryptographically motivated techniques seems like too much of a stretch as the methods serve very different purposes and in most cases cannot even be directly compared the first and second paragraphs discuss the vision of distributing the training of models between multiple parties i agree that this is a useful area to study and direction for the community to go but as the introduction of this paper states this is the most interesting when the parties have control over logically separate components of the modeling pipeline and also when joint training of the components is being done potentially on disjoint and private datasets the empirical results of this paper do none of this as they only look at the case when a single layer is being replaced furthermore the motivation and positioning of the paper are not carried through in the empirical setup where they investigate approaches that do training over all of the parameters of the model breaking the assumption that the parties should be independent and should not share information metrics for measuring model completeness section 31 defines the metric of completion hardness that is used throughout the rest of the paper the metric looks at the number of iterations that retraining the model takes to reach the same performance as the original model its not clear why this is an important metric and i am not convinced it is the right one to use as it 1 does not give a notion of how nicely the missing portion was recovered just that the model reached the same accuracy as the original network and 2 methods with a very long per iteration runtime such as secondorder and samplingbased methods could be used to reach a good performance in a small number of iterations making these methods appear to be very good at completing models i dont think it is nice that this metric relies on the same optimizer being used for the original model and the completed model i think its more interesting to study how much data is required to recover missing portions of the model instead of how many iterations are needed to recover the same performance the supervised learning experiments appear to be done using the entire dataset while the rl experiments do present a setting where the data is not the same empirical results i am also surprised by the empirical finding in section 51 that t1 outperforms t2 since it seems like only optimizing the parameters of the missing layer would be the best approach i think that if a similarity metric was used instead t2 would be significantly better at finding the layer that is the most similar to the layer that was removed some smaller comments 1 in section 31 the definition of ct does not use t explicitly inside of it 2 in the last paragraph of section 31 and first paragraph of section 32 n should be defined as an iteration that reaches the best loss 3 the description of t3 does not say what method is used to optimize the overparameterized layer is it t1 or t2 4 why does t4 use t1 instead of t2 5 in the experimental setup why is t2 applied with a different learning rate schedule than the original training procedure 6 why is t2 not shown in the alexnet results for figure 2 7 the dissimilar axes between the plots in figure 2 and figure 3 make them difficult to compare and interpret 8 its surprising that in figure 3 the hardness of alpha10 for t2 is 10 for everything docsepthe authors introduce the problem of model completion mc to the machine learning
community they provide a thorough review of related works and convincingly argue that existing solutions to this sort of task ie homomorphic encryption and multiparty computation are not fully satisfactory in the domain of neural network learning the authors also provide extensive numerical experiments attempting to quantify their proposed measure of hardness of model completion mchardnesstalpha on a diverse set of supervised and rlrelated tasks and they provide extensive analysis of those results i find the paper to raise more questions than it answers in a good way the authors note that their measure depends strongly on the peculiarities of the particular retraining scheme used do the authors worry that such a measure could end up being too loose essentially always a function of whatever the fastest optimization scheme happens to be for any particular architecture more broadly theres an additional axis to the optimization problem which is how much does the training scheme know about the particulars of the problem ranging from literally has oracle access to the weights of the trained model ie trivial mchardness 0 always to knows what the architecture of the heldout layer is and has been designed to optimize that particular network see eg learned optimizers to knows a little bit about the problem structure and uses hyperparameter tuned adam to knows nothing about the problem and picks a random architecture to use for the held out weights training it with sgd model completion seems morally or at least from a security standpoint slightly underspecified without being more careful about what information each player in this game has access to as it stands its an excellent empirical measure and captures a very interesting problem but id like to know how to make it even more theoretically grounded an excellent contribution and im excited to see followup work we of course have tremendous inductive bias in how we go about designing architectures for neural networks but hopefully you understand my point ### Summary:
as all the reviewers have highlighted there is some interesting analysis in this paper on understanding which models can be easier to complete the experiments are quite thorough and seem reproducible however the biggest limitation and the one that is making it harder for the reviewers to come to a consensus is the fact that the motivation seems mismatched with the provided approach there is quite a lot of focus on security and being robust to an adversary model splitting is proposed as a reasonable solution however the model completion hardness measure proposed is insufficiently justified both in that its not clear what security guarantees it provides and in that its not clear why training time was chosen over other metrics like number of samples as mentioned by a reviewer if this measure had been previously proposed and the focus of this paper was to provide empirical insight that might be fine but that does not appear to be the case this mismatch is evident also in the writing in the paper after the introduction the paper largely reads as understanding how retrainable different architectures are under which problem settings when replacing an entire layer with little to no mention of security or privacy in summary this paper has some interesting ideas but an unclear focus the proposed strategy should be better justified or maybe even better for the larger iclr audience the provided analysis could be motivated for other settings such as understanding convergence rates or trainability in neural networks
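as a rough illustration of the kind of completion hardness measure discussed in the reviews above here is a minimal sketch it assumes hardness is taken as the ratio of retraining steps to original training steps needed to reach a fraction alpha of the best original performance the function names and toy numbers are illustrative assumptions and not the papers actual definition or results

def steps_to_reach(curve, target):
    # return the first step index at which the performance curve reaches target, or None
    for step, value in enumerate(curve):
        if value >= target:
            return step
    return None

def completion_hardness(original_curve, retrain_curve, alpha=0.9):
    # hypothetical mc-hardness: retraining steps divided by original training steps
    # needed to reach alpha times the best original performance (larger means harder)
    target = alpha * max(original_curve)
    n_original = steps_to_reach(original_curve, target)
    n_retrain = steps_to_reach(retrain_curve, target)
    if n_original is None or n_retrain is None:
        return float("inf")  # the target level was never reached
    return n_retrain / max(n_original, 1)

# toy curves: the retrained model needs roughly twice as many steps to reach 90% of best accuracy
original = [0.2, 0.5, 0.7, 0.8, 0.85, 0.9]
retrained = [0.1, 0.3, 0.5, 0.6, 0.7, 0.75, 0.8, 0.82, 0.85, 0.88, 0.9]
print(completion_hardness(original, retrained, alpha=0.9))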
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper considers the problem of learning tractable probabilistic models tpms and presents a gradientbased method for learning a tpm from local possibly inconsistent estimates specifically a tpm is first learnt from the input dataset and is subsequently refined by updating its parameters using a set of local probability estimates defined over small subsets of variables ie local marginal distributions the parameters are updated using a gradient descent based approach with an objective function that ensures closeness to the local estimates as well as the learnt model the empirical evaluation demonstrates that the proposed method produces tractable models that are superior to those learned solely from data the paper is fairly well written and organised the quality of the presentation is overall very good and therefore the paper is relatively easy to follow most of the concepts are introduced and discussed in a fairly clear manner the empirical evaluation is sound and demonstrates clearly the strength of the proposed method see above
docsep this paper investigates a learning setting where training data is paired with noisy local estimates focusing on cutset networks a popular class of tractable probabilistic models although the results seem easily extendable to other tpm classes tractable models enable the efficient evaluation of the gradients required to solve this learning problem the paper then proposes an iterative gradient ascent approach to further speed up convergence instead of maximizing the original objective involving both the local estimates and training data the paper proposes a momentmatching variant where the local estimates are considered as a postprocessing step after training the cn on data using standard techniques i found the paper generally wellwritten i think it is quite accessible even for nonexperts in tractable models while being sufficiently detailed in the description of the contribution the empirical evaluation is also convincing i am not completely convinced that having access to these local estimates is quite common in realworld applications the work cited at lines 38-39 is not specifically supporting the claim considering that prior work in this setting is quite limited according to the related work section it would be nice if the paper substantiated this claim more with realworld examples updated after rebuttal the authors addressed my concerns in their response and included the changes in an updated version of the paper i did not find any unaddressed limitations in the paper i am looking forward to reading an answer to q1 in order to better assess the impact of this work possibly raising my contribution and overall rating
docsep the paper proposes an algorithm for learning a tractable probabilistic model tpm from a dataset consisting of 1 a possibly small complete dataset and 2 possibly many local marginal estimates the tractable probabilistic model is a cutset network or mixtures thereof they propose a tractable gradient algorithm for learning such a model and show empirically how their algorithm can significantly improve the quality of a learned model when local marginal estimates can be taken into account cutset networks are tractable given complete data their parameters can be learned in closed form and otherwise their gradients can also be computed in polynomial time the learning algorithm is based on gradient descent where the local marginal estimates are incorporated as a soft constraint in the objective function as the authors have mentioned the learning problem or variations of it have been considered before in the probabilistic graphical models literature but there are some fundamental difficulties with this learning task on bayesian networks and markov networks namely checking whether local marginal constraints are satisfied requires inference in the network which is computationally hard for bns/mns but can be done in polytime on tractable models such as cutset networks to my knowledge the exploitation of a tractable probabilistic model tpm for this learning task is novel there are certainly practitioners who are interested in this learning task for their applications and i believe this type of paper would help spread the practical use of tpms the proposed problem learning a probabilistic model from data and marginals is relevant in practice and the authors apply a more modern class of tractable statistical models that is able to overcome limitations of classical models like bayesian networks the experiments demonstrate the utility on the learning problem because of the computational limitations inherent in bayesian networks one did not see this learning scenario as often in practice but given the use of tractable models it may see increased visibility no negative societal impact and otherwise limitations and relations to related work were satisfactory
docsep this paper proposes to incorporate socalled local estimates into the optimization of probabilistic models since such estimates are typically intractable the paper focuses on using tractable probabilistic models in order to enforce them into the loss function the main discussion centers on learning cutset networks with possibly inconsistent local estimates to do so the paper formulates this as a constrained optimization problem constraints being the local estimates and takes a lagrangian relaxation approach which basically ends up looking like a weighted joint optimization problem finally due to the properties of cutset networks where parameters map to conditional probabilities the loss function variance can be reduced through momentmatching instead of sgd leading to faster convergence experiments are run on the 20 datasets benchmark showing that incorporating local estimates leads to better learned models when evaluated on kl divergence against an oracle model chosen to be a mixture of cutset networks my overall opinion of this paper is poor the paper has three significant and pervasive shortcomings from start to finish the assumption of local estimates felt impractical and not welljustified are there really many scenarios where we would have large access to local estimates if so why are there no experiments on real datasets with real local estimates the current experiments are completely synthetic in the sense that the paper learns an oracle model using the full dataset to synthetically generate the local estimates and then synthetically hides part of the dataset from the learner models why are the experiments only testing q vs r with no reasonable baselines it is completely expected that r will do better than q since it is given strictly more information essentially local statistics from the oracle model where are the ablations using sgd using other tpms or using any other method to incorporate local estimates altogether i find the experimental evaluation sorely lacking why is there such emphasis on cutset networks the paper tries to start off general making statements that try to encompass all tpms but the entirety of the experiments ends up being focused on cutset networks even the momentmatching technique is specific to cutset networks general tpms do not have the conditional distribution interpretation in their parameters and would probably need to rely on sgd for optimization given this there should absolutely have been studies run on other families of tpms as it stands the paper and its title should advertise only cutset networks on this why dont we use general probabilistic circuits what are the drawbacks of pcs compared to cutset networks strengths the formulation is reasonable and the derivations look correct the writing is reasonably clear weaknesses poor justification of access to local estimates at the very least there should be experiments on datasets with real local estimates not synthetically generated experiments section is weak and far from detailed missing ablations (sgd instead of moment matching, other tpms, choices of oracle model) unjustified choice of the heavy emphasis on cutset networks ### Summary:
this submission did not reach a full agreement among pc members i will not repeat the arguments here as they can be read in the reviews please see eg review wy8e which according to the authors captures well their intention the main open criticism is the lack of a real application where the type of assumption used in the paper is present the other important one being the comparison with other approaches which seems to have been justified by the authors to a good extent the concern was originally about the existence of those applications but later this was resolved it remained as a concern that the real application is not shown in this work i consider that this is small when weighted against the positive comments in all reviews
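editor's note — the reviews above describe the learning task only in words: fit a tractable model to a complete dataset while a set of local marginal estimates enters the objective as a soft, lagrangian-style penalty. the sketch below is not the paper's code and makes several simplifying assumptions (a fully tabular softmax model over three binary variables instead of a cutset network, a squared penalty, and illustrative names such as m_hat and lam); it only shows the shape of that objective and its gradient-descent optimization.

```python
import numpy as np

rng = np.random.default_rng(0)

# all 8 joint states of three binary variables; the "model" is a softmax over them,
# so every marginal is exact -- a stand-in for the tractable model in the reviews
states = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)], dtype=float)

# toy complete dataset: empirical frequencies over the 8 joint states
data = rng.choice(8, size=200, p=np.array([4, 1, 1, 1, 1, 1, 1, 4]) / 14.0)
counts = np.bincount(data, minlength=8) / len(data)

# local (possibly inconsistent) estimates of p(x0 = 1) and p(x2 = 1)
A = np.stack([states[:, 0], states[:, 2]])   # maps joint probabilities to the two marginals
m_hat = np.array([0.7, 0.4])                 # the given local estimates (assumed values)
lam = 5.0                                    # penalty weight, a Lagrangian-style soft constraint

logits = np.zeros(8)
for _ in range(2000):
    p = np.exp(logits - logits.max())
    p /= p.sum()
    # dL/dp: negative log-likelihood of the data plus a squared penalty toward m_hat
    dLdp = -counts / p + 2.0 * lam * (A.T @ (A @ p - m_hat))
    # chain rule through the softmax: grad = (diag(p) - p p^T) @ dL/dp
    grad = p * dLdp - p * (p @ dLdp)
    logits -= 0.05 * grad

p = np.exp(logits - logits.max())
p /= p.sum()
print("model marginals:", A @ p, " targets:", m_hat)
```

running it shows the learned marginals pulled away from the purely data-driven values toward the supplied local estimates, with lam controlling the trade-off between the two terms.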
[ input_ids, attention_mask, labels: token-id and mask arrays duplicating the example above — numeric sequences omitted ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: summary the article proposes to tune an hmc sampler by maximising the expected log target density e[log p(x_t)] over the parameters of the hmc sampler furthermore the article studies the influence of the initial distribution while the approach is certainly interesting i have not found the empirical studies satisfying enough comments 1 the article considers a vector epsilon as well as a mass matrix usually the parameter epsilon is chosen as a scalar number choosing epsilon as a vector can indeed also be seen as a particular type of preconditioning or choice of mass matrix i have found this part of the paper not extremely well explained 2 it is indeed also difficult to choose l and that is mainly what the no-u-turn method tries to automate in practice dynamically adapting l can make a lot of difference in highdimensional settings and/or different parts of the state space exhibit different scales it would have been very interesting to investigate how the proposed method can be used in conjunction with no-u-turn type strategies furthermore it was not entirely clear to me how the epsilon was tuned when the no-u-turn was used 3 in the 2d example since the authors have used rejection sampling to produce the plots it is also easy to accurately estimate the mean and covariance of the target distribution it would have been interesting to use these statistics although it is not possible to do so in more complex scenarios and see if this leads to improved performances 4 in the min barp method why choose a target acceptance rate of 0.25 my experience says that the number is usually chosen much higher 5 while reporting the ksd i think it would have been very interesting to report the ess or variations of it since it is the standard measure of efficiency in the mcmc literature 6 finally while the 2d examples are certainly very interesting i am not convinced that directly going from 2d to super-difficult targets is the right approach to understand the properties of the proposed methods there are many settings that are more difficult than these 2d distributions but much more tractable than the dlg/molecular targets in summary i think that the authors are proposing an interesting line of research but more careful numerical investigations are necessary to really understand the worth of the methodology
docsep summary this paper proposes a variational inference based framework to tune some of the hyperparameters of hmc algorithms automatically the authors drop the entropy term from the regular elbo formulation which facilitates a gradientbased approach however dropping this term requires extra care for which the authors offer an automated method finally the authors demonstrate empirical validation on several problems strength the authors do an excellent job of articulating their intuition behind the idea both see section 3 while dropping the entropy term from the elbo decomposition is heuristicbased the explanations are wellformulated and figure 1 does an excellent job of getting the point across more so since dropping the entropy term can cause pathological behaviors the authors propose a method to ensure wider initial distributions i commend the authors for the nontrivial engineering that was required to make their ideas work i also commend the authors effort of conducting statistical tests and extensive empirical evaluations concerns my main concern with the work is that it is often on par with the competing methods i understand that a new method doesnt
need to be sota on every benchmark and the sksd enabled variants that achieve this performance are prohibitively slow see tables 6 and 9 i could not help but feel concerned when no discussion was offered for an almost tenfold increase in the computational time for training dlgms to convince me i will suggest offering an honest discussion on the runtimes of the approaches i find the discussion in section b.1 important and believe it should be more formal specifically i will suggest presenting in algorithmic form what objective is used at which stage alternatively authors can choose to restructure this some other way however it is too important to be left in its current form updates after the rebuttal i like the paper and found the revised version more transparent i support the engineering approach of the paper however as we all know these papers often require authors to go to greater lengths to convince after reading the other discussion and reviews i think the authors can consider a few additional experiments i would suggest investing in a more involved toy experiment to better motivate the engineering solutions if possible authors can also consider a more careful ablation study to establish the relevance of each component on this toy model further the authors offered explanations for the training time aberrations if possible authors can consider including the equally fast variants in the revision to be more convincing
docsep the paper proposes a method to optimize the parameters of the hybrid monte carlo hmc algorithm the step size and the diagonal of the momentum covariance matrix in order to do that the authors consider the distribution of samples qt obtained after t iterations of the algorithm t accept/reject steps starting from some distribution q0 then a reasonable objective for the optimization would be the kl divergence between qt and the target density p however the evaluation of the kl divergence includes the entropy of qt whose density is intractable due to numerous accept/reject steps the proposed solution to this difficulty is to ignore the entropy term and maximize the log density of the target on samples from qt to avoid the degenerate solution due to ignorance of the entropy the authors propose to choose q0 carefully eg to learn q0 as a normalizing flow approximating the target p via minimization of a mass-covering alpha-divergence the latter involves the usage of samples from the target distribution major concerns 1 the method is an engineering trick rather than a grounded approach to the optimization of sampling algorithms indeed in many cases people use mcmc methods to obtain guarantees for the sampling procedure the proposed method removes all these guarantees by relying on the choice of the initial distribution q0 moreover the optimization of q0 via mass-covering objectives is a notoriously hard problem since samples from the target distribution are not given in a usual setting 2 i think the paper lacks an essential comparison with the method proposed by titsias gradientbased adaptive markov chain monte carlo 2019 this paper proposes a more general objective for parameter optimization explicitly fostering high entropy of the proposal moreover in contrast with the learning step of q0 it operates in an adaptive manner not requiring any pretraining steps 3 given the limited theoretical novelty i would expect the iclr paper to demonstrate highly successful empirical results however it is not the case for the current submission im quite confident that the results on cv tasks are out of practical interest also for the molecular dynamics the metrics choice hinders the assessment of the practical significance minor comments 1 i dont find the comparison of marginal distributions on the 60d problem to be a convincing way to compare samplers performance i would suggest considering either another metric or another problem 2 i also would suggest to include the description, or at least the formula, for the density of the problem molecular configurations it would provide the reader with an additional intuition on its difficulty 3 i think section 4 would benefit from the clear description of the choice of s for instance from the description of the variable mu which appears there for the first time additional comments after rereading the review i feel that it may sound a bit harsh for the authors therefore i want to say aloud that i find the papers subject to be of great interest consider any work in this direction valuable and encourage the authors to continue their studies my criticism is only an attempt to approach the review process objectively
docsep summary the paper introduces a gradientbased approach for tuning the stepsize and the diagonal mass matrix of hmc together with the parameters of an initial distribution for the markov chain they suggest different objectives amenable for sgd: maximize the expected target log density of the final state of the chain but also an objective to ensure a somewhat wide initial distribution the approach is illustrated on 2d toy models deep latent gaussian models on fashion mnist and molecular configurations positives the submission suggests a practical approach for tuning hmc that remains a challenging problem the combination of the different objectives is new as far as i am aware empirical experiments are provided to justify the approach on standard benchmark problems where it seems to be competitive with state of the art methods and a more extensive study on sampling molecular configurations negatives i feel that further arguments are needed to justify why the entropy of the proposed state can be ignored when adapting the hyperparameters of the sampler the paper argues that since hmc by construction cannot collapse to such a point mass we argue that the entropy term can be dropped provided the initial distribution of the chain has enough coverage of the target i am not convinced by this take a standard normal target then a leapfrog integrator with 2 steps unit mass matrix and step size of sqrt(2) proposes deterministically from a point mass distribution and this happens everywhere on the state space while this might be an unrealistic example it is not clear to me how such situations can be avoided in general it is also not clear to me why the sliced kernelized stein discrepancy objective automatically adjusts the width of the initial distribution in equation 4 the discrepancy is between the final state and the target and i fail to see how this relates to the width of the initial density recommendations i vote for a weak reject at the moment the ideas proposed in the paper are indeed interesting however i am not yet convinced that the objectives yield hmc kernels that explore the state space well so the hmc proposal does not become close to deterministic or completes a u-turn so that entropy comes largely from the initial distribution which is however trained with a different objective also the use of the sliced kernelized stein discrepancy specifically should be better motivated i am happy to increase my score if the authors better clarify these points further comments/issues the authors claim in the abstract that existing approaches optimize a tractable lower bound that is too loose to be useful in practice can this be backed up more concretely i understand that such methods such as thin et al 2020 use a looser bound but not that these types of bounds are useless in practice in section 3.1 how do the acceptance rates compare for the narrow vs the wide initial distribution my intuition would be that the acceptance rates for the narrow one are smaller than for the wide one would it then be possible to get a better exploration even in this case by including an objective to target an acceptance rate say increase the stepsize if the acceptance rate is above 0.65 minor comments is it obvious that equation 6 minimizes the 1-divergence for k=1 is this not the standard vae 0-divergence while for k>1 the iwae objective can be seen as a 0-divergence on an extended space what are the gamma variables simulated from n(0, i) exactly are they really the momentum variables are the initial momentum variables not from n(0, diag(m)) in the experiments from section 5.1 why do you target a minimum acceptance rate of 0.25 and not an average rate of 0.65 which seems a more common choice in the adaptive mcmc literature ### Summary:
this paper proposes a tuning strategy for hamiltonian monte carlo hmc the proposed algorithm optimizes a modified variational objective over the t-step distribution of an hmc chain the proposed scheme is evaluated experimentally all of the reviewers agreed that this is an important problem and that the proposed method is promising unfortunately reviewers had reservations about the empirical evaluation and the theoretical properties of the scheme because the evaluation of the scheme is primarily empirical i cannot recommend acceptance of the paper in its current form i agree with the following specific reviewer concerns the proposed method does not come with any particular guarantees and particularly no guarantees regarding the effect of dropping the entropy term and using an sksd training scheme to compensate while guarantees are not necessary for publication the paper should make up for this with comprehensive and convincing experiments i agree with r1 that more careful ablation studies on toy models are needed if nothing else to reveal the strengths and weaknesses of the proposed approach i would also recommend a more careful discussion about the computational cost of this method and how it can be fairly compared to baselines i dont agree that deliberately wasteful experiments reveal much especially if running more realistic experiments reduces the relative impact of the proposed method
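editor's note — the point-mass counterexample raised by the last reviewer above is easy to check numerically. the snippet below is not taken from the paper under review; it merely verifies that, for a standard normal target with unit mass, two leapfrog steps of size sqrt(2) map (x, v) to (-x, -v), so the proposed position is -x whatever momentum is resampled — exactly the collapsed, zero-entropy proposal the reviewer worries about when the entropy term is dropped.

```python
import numpy as np

def leapfrog(x, v, eps, n_steps):
    # standard normal target with unit mass: H(x, v) = x**2 / 2 + v**2 / 2
    for _ in range(n_steps):
        v = v - 0.5 * eps * x   # half step on the momentum
        x = x + eps * v         # full step on the position
        v = v - 0.5 * eps * x   # half step on the momentum
    return x, v

rng = np.random.default_rng(0)
x0 = 1.3
eps = np.sqrt(2.0)
proposals = [leapfrog(x0, rng.standard_normal(), eps, 2)[0] for _ in range(5)]
print(proposals)  # every proposed position equals -x0, whatever momentum was drawn
```

the printed proposals are all equal to -x0 (up to floating point), independent of the five different momentum draws.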
[ input_ids and attention_mask: token-id and mask arrays for the example above (truncated at the end of this excerpt) — numeric sequences omitted ]
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 432, 374, 69, 281, 2221, 38157, 7831, 310, 253, 987, 2746, 281, 2096, 253, 3607, 273, 253, 4081, 3082, 627, 403, 1142, 7533, 326, 403, 625, 2834, 685, 841, 374, 69, 10670, 533, 1199, 625, 10649, 494, 685, 253, 277, 21619, 36911, 8571, 50276, 249, 6010, 891, 1158, 326, 253, 4477, 403, 36636, 271, 4722, 1386, 273, 2561, 533, 625, 10182, 10704, 14006, 403, 3309, 281, 1663, 2096, 253, 4409, 273, 253, 16182, 7152, 33032, 50276, 8774, 50276, 2520, 2929, 29328, 247, 39762, 17032, 1754, 7792, 281, 19928, 690, 273, 253, 4373, 22041, 273, 288, 17475, 11333, 8356, 253, 4477, 5926, 253, 15579, 1307, 432, 3963, 1045, 2399, 15895, 534, 29499, 247, 11786, 3169, 2746, 2299, 18752, 436, 1307, 4419, 4465, 1557, 323, 534, 4477, 3959, 271, 16644, 9349, 4720, 253, 4477, 7568, 16774, 12820, 327, 2067, 3237, 50275, 45563, 50276, 783, 4477, 513, 271, 7126, 2628, 273, 18575, 8287, 616, 30328, 3212, 253, 2934, 1097, 923, 2593, 495, 1223, 18752, 253, 15579, 1307, 432, 1045, 2399, 14717, 310, 47641, 3169, 253, 22909, 403, 973, 630, 2907, 285, 4677, 337, 1057, 271, 7126, 2628, 273, 2970, 253, 1127, 2439, 50275, 3062, 594, 1580, 18752, 253, 15579, 1307, 476, 2847, 18977, 13576, 253, 4477, 12661, 247, 1332, 281, 5416, 14200, 3302, 10670, 891, 49638, 253, 4477, 323, 253, 37825, 11369, 326, 369, 2424, 281, 1056, 616, 5697, 789, 891, 671, 49638, 253, 4477, 3434, 273, 16472, 7605, 5216, 285, 9470, 16774, 27163, 50274, 585, 1209, 2224, 50276, 2577, 2022, 4468, 342, 253, 789, 310, 326, 352, 310, 2223, 327, 1148, 342, 253, 11771, 3082, 74, 2096, 326, 247, 747, 1332, 36908, 878, 281, 320, 256, 5503, 327, 1046, 22791, 395, 253, 256, 661, 69, 11410, 11640, 326, 5115, 436, 3045, 403, 9419, 25785, 3468, 923, 7180, 721, 285, 898, 891, 812, 417, 1361, 533, 1928, 7514, 672, 642, 5955, 369, 5907, 323, 271, 2761, 3578, 8089, 2572, 275, 253, 15180, 673, 323, 3733, 277, 21619, 983, 281, 18578, 479, 891, 588, 1804, 9159, 271, 8274, 5955, 327, 253, 1408, 3181, 273, 253, 7274, 50276, 74, 1089, 253, 5955, 275, 2593, 270, 18, 18108, 285, 2868, 352, 943, 320, 625, 7473, 50276, 46458, 891, 588, 1804, 4649, 303, 3006, 752, 8103, 310, 908, 387, 534, 3924, 31506, 4477, 476, 5206, 281, 1551, 7818, 436, 690, 643, 1039, 2299, 352, 310, 1512, 1774, 281, 320, 1669, 275, 697, 1655, 830, 50274, 484, 24275, 846, 253, 30080, 22559, 50276, 74, 751, 253, 2929, 285, 1119, 253, 17265, 2715, 625, 13955, 891, 1329, 253, 11369, 2746, 273, 253, 2929, 2299, 347, 359, 512, 871, 841, 9380, 2223, 2430, 4477, 281, 564, 281, 3687, 16095, 281, 18578, 846, 4361, 253, 643, 5955, 285, 10123, 891, 1158, 253, 4477, 476, 1908, 247, 1643, 3081, 4679, 891, 651, 1804, 23415, 275, 247, 625, 3206, 20953, 16217, 2092, 281, 1805, 41509, 253, 11369, 5482, 604, 1896, 4477, 476, 671, 1908, 247, 625, 10182, 28913, 1263, 281, 5100, 253, 17200, 273, 1016, 4445, 327, 436, 281, 1105, 49797, 2007, 253, 4477, 5907, 22909, 323, 253, 3733, 673, 22939, 569, 604, 1896, 4477, 476, 1908, 1690, 253, 9696, 7957, 20617, 1103, 275, 253, 18520, 281, 320, 625, 21414, 5474, 339, 431, 248, 2929, 29328, 247, 1332, 281, 22318, 253, 3602, 273, 253, 9769, 1114, 442, 1113, 4213, 288, 17475, 5933, 253, 3213, 1979, 285, 253, 16421, 273, 2774, 7640, 26677, 4315, 275, 1340, 281, 513, 326, 253, 4477, 1908, 253, 3268, 273, 3530, 2805, 85, 2797, 846, 246, 25142, 273, 253, 5933, 246, 2997, 49844, 5018, 4983, 432, 690, 3268, 2805, 17, 840, 247, 5272, 8103, 323, 253, 13757, 651, 320, 253, 465, 392, 2373, 9515, 875, 2805, 85, 285, 253, 2303, 4038, 268, 2299, 253, 7103, 273, 253, 465, 392, 2373, 9515, 3797, 253, 15579, 
273, 2805, 85, 3692, 4038, 310, 540, 44374, 1955, 281, 7418, 2997, 49844, 5018, 253, 4081, 2900, 281, 436, 10183, 310, 281, 11823, 253, 15579, 1307, 285, 22950, 253, 2412, 4038, 273, 253, 2303, 327, 3530, 432, 2805, 85, 281, 3693, 253, 29458, 2900, 1955, 281, 24492, 273, 253, 15579, 253, 4477, 12661, 281, 5206, 2805, 17, 9257, 24088, 281, 3037, 2805, 17, 347, 247, 2622, 3006, 2685, 4020, 839, 253, 2303, 268, 3066, 41458, 273, 2280, 16484, 272, 355, 545, 324, 2373, 9515, 253, 6158, 8687, 253, 10393, 273, 3530, 432, 253, 2303, 3268, 50276, 24330, 7350, 337, 253, 1332, 310, 271, 11369, 10480, 2581, 685, 247, 28462, 2746, 281, 253, 13757, 273, 10491, 11333, 6296, 275, 1142, 2219, 952, 897, 278, 3591, 68, 3082, 281, 4044, 23632, 323, 253, 10491, 5199, 253, 4081, 1332, 26586, 512, 841, 23632, 407, 22128, 327, 253, 4327, 273, 253, 3302, 3268, 2805, 17, 25761, 253, 13757, 273, 2805, 17, 3066, 2280, 16484, 272, 16566, 310, 247, 417, 49186, 1892, 1895, 1580, 3530, 432, 253, 2303, 3268, 403, 417, 1677, 275, 247, 7312, 4758, 50276, 19, 891, 1158, 253, 2929, 19756, 271, 5667, 5301, 342, 253, 1332, 4081, 407, 246, 953, 6358, 11786, 3169, 17825, 1616, 729, 5931, 1114, 442, 1113, 4213, 6247, 436, 2929, 29328, 247, 625, 2087, 8103, 323, 4764, 13757, 11120, 25243, 2158, 1029, 15579, 273, 253, 10419, 25761, 275, 4499, 342, 253, 4715, 3213, 273, 2805, 17, 352, 17209, 275, 271, 17825, 5133, 417, 10568, 667, 3215, 26208, 5018, 50276, 20, 1677, 253, 3710, 10527, 38135, 891, 651, 1902, 253, 17857, 32888, 2929, 281, 7568, 4122, 5547, 16774, 1543, 2299, 352, 310, 417, 253, 1083, 323, 253, 1655, 19529, 516, 3240, 13224, 326, 253, 1543, 327, 30105, 8892, 403, 562, 273, 8542, 1600, 671, 323, 253, 5787, 8062, 253, 17082, 4327, 17134, 398, 253, 6803, 273, 253, 8542, 8453, 50276, 37585, 5701, 337, 891, 13414, 1089, 253, 5301, 273, 16888, 10670, 327, 253, 3925, 69, 1895, 281, 320, 247, 21414, 1039, 281, 7277, 1775, 446, 398, 3045, 891, 651, 1804, 7296, 2057, 1529, 7982, 390, 1529, 1895, 374, 891, 671, 651, 1804, 281, 2486, 253, 5740, 387, 1878, 253, 7212, 323, 253, 4038, 273, 253, 1895, 5787, 16012, 352, 651, 2085, 253, 9414, 342, 271, 3081, 30328, 327, 697, 10183, 495, 891, 1158, 2593, 577, 651, 5649, 432, 253, 2590, 5740, 273, 253, 4327, 273, 256, 323, 4227, 432, 253, 5740, 273, 253, 4778, 12910, 534, 4620, 627, 323, 253, 806, 673, 50276, 38092, 5701, 846, 294, 24042, 253, 2278, 891, 1928, 326, 352, 778, 3590, 247, 2372, 17770, 323, 253, 4477, 3103, 891, 971, 281, 1333, 33454, 326, 891, 1089, 253, 9380, 2256, 281, 320, 273, 1270, 1600, 1908, 667, 789, 275, 436, 3884, 9865, 285, 11907, 253, 4477, 281, 4035, 616, 2175, 619, 14226, 310, 760, 271, 3177, 281, 2746, 253, 2278, 1232, 38304, 7152, 339, 793, 360, 3454, 253, 2929, 23970, 247, 11786, 3169, 2746, 323, 25184, 253, 5018, 907, 285, 253, 16421, 2280, 4315, 273, 288, 17475, 2366, 342, 253, 3602, 273, 271, 3302, 3268, 323, 253, 1616, 729, 5931, 597, 1804, 1027, 16566, 42133, 323, 256, 35333, 22950, 253, 3264, 2303, 2412, 20425, 273, 253, 2457, 1375, 273, 253, 5931, 533, 671, 271, 8103, 281, 5416, 247, 8489, 4618, 3302, 3268, 253, 2746, 310, 12800, 327, 374, 69, 20953, 3210, 3676, 21624, 305, 12064, 3210, 327, 8142, 278, 79, 382, 285, 5787, 16012, 50276, 993, 23223, 253, 19529, 5936, 247, 8542, 2746, 323, 25184, 288, 17475, 326, 4558, 247, 11132, 1895, 253, 5019, 273, 253, 1027, 16566, 310, 747, 347, 2080, 347, 891, 717, 6600, 50276, 358, 5378, 474, 4679, 403, 2530, 281, 15249, 253, 2746, 327, 2629, 22791, 3237, 835, 352, 310, 3133, 281, 320, 12085, 342, 1375, 273, 253, 
1445, 3082, 285, 247, 625, 9470, 1263, 327, 10491, 5787, 16012, 50275, 8265, 3993, 891, 1928, 326, 2007, 7125, 403, 3058, 281, 15249, 2139, 253, 15579, 273, 253, 4081, 1375, 476, 320, 12841, 672, 42174, 253, 4373, 22041, 273, 253, 1775, 17407, 253, 2929, 8219, 326, 1580, 288, 17475, 407, 5140, 2550, 13551, 281, 824, 247, 1127, 2280, 359, 9059, 326, 253, 15579, 1307, 476, 320, 8231, 2530, 253, 3302, 3268, 273, 253, 5931, 556, 2217, 7031, 273, 253, 2303, 891, 717, 417, 13762, 407, 436, 1379, 247, 2629, 2622, 2303, 840, 247, 26416, 35255, 6058, 442, 737, 1080, 342, 374, 5018, 3943, 2280, 4315, 285, 3213, 1979, 273, 8084, 19, 29328, 11544, 18260, 432, 247, 1127, 2280, 3268, 285, 436, 6569, 11678, 327, 253, 1375, 2317, 1223, 436, 1537, 320, 271, 46521, 1650, 352, 310, 417, 2590, 281, 479, 849, 824, 9534, 476, 320, 16371, 275, 2087, 352, 310, 671, 417, 2590, 281, 479, 2139, 253, 25530, 10295, 1025, 2870, 249, 26210, 8103, 8356, 4575, 84, 253, 4871, 273, 253, 3302, 3268, 275, 5150, 577, 253, 26210, 310, 875, 253, 2457, 1375, 285, 253, 2303, 285, 891, 1891, 281, 923, 849, 436, 7033, 281, 253, 4871, 273, 253, 3302, 4038, 50276, 250, 27167, 569, 891, 6273, 323, 247, 5075, 12009, 387, 253, 2774, 253, 5697, 4081, 275, 253, 2929, 403, 6296, 4722, 2299, 891, 717, 417, 2568, 13762, 326, 253, 16566, 4917, 288, 17475, 34501, 326, 8338, 253, 1375, 2317, 973, 594, 253, 288, 17475, 10419, 1057, 417, 2489, 2810, 281, 30027, 7507, 5298, 247, 2780, 662, 594, 326, 15579, 3249, 8127, 432, 253, 3302, 3268, 534, 310, 2299, 10166, 342, 247, 1027, 8103, 671, 253, 897, 273, 253, 25530, 10295, 1025, 2870, 249, 26210, 5742, 943, 320, 1805, 17194, 891, 717, 5211, 281, 2572, 619, 4868, 604, 253, 4477, 1805, 19148, 841, 2792, 50276, 44295, 5701, 22402, 253, 4477, 1750, 275, 253, 12002, 326, 5368, 7274, 22318, 247, 10649, 494, 2406, 3033, 326, 310, 1512, 13155, 281, 320, 4217, 275, 3946, 476, 436, 320, 17245, 598, 625, 345, 2414, 600, 891, 2096, 326, 824, 3082, 824, 347, 6906, 1162, 355, 9169, 897, 247, 2343, 14356, 3033, 533, 417, 326, 841, 3510, 273, 14493, 403, 19437, 275, 3946, 275, 2593, 4562, 849, 513, 253, 14924, 4142, 7277, 323, 253, 6891, 4632, 253, 4618, 3302, 3268, 619, 30328, 651, 320, 326, 253, 14924, 4142, 323, 253, 6891, 581, 403, 4577, 685, 323, 253, 4618, 581, 651, 352, 840, 320, 1896, 281, 755, 247, 1805, 17947, 1014, 275, 436, 1083, 407, 1690, 271, 8103, 281, 2303, 271, 14924, 2281, 1333, 2572, 253, 5018, 907, 604, 253, 14924, 2281, 310, 1840, 470, 2082, 50275, 37585, 5701, 310, 352, 4755, 326, 5150, 721, 46926, 253, 337, 69, 2373, 9515, 323, 465, 18, 310, 436, 417, 253, 2629, 362, 3348, 17, 69, 2373, 9515, 1223, 323, 465, 18, 253, 891, 88, 3348, 8103, 476, 320, 2326, 347, 247, 470, 69, 2373, 9515, 327, 271, 6508, 2317, 752, 403, 253, 17356, 4903, 15524, 432, 295, 17, 74, 4555, 403, 597, 1663, 253, 10254, 4903, 403, 253, 3302, 10254, 4903, 417, 432, 295, 17, 5168, 19803, 275, 253, 4679, 432, 2593, 8319, 2139, 513, 368, 2303, 247, 5927, 14924, 2281, 273, 470, 1099, 285, 417, 271, 3388, 2281, 273, 470, 2082, 534, 3133, 247, 625, 1846, 4327, 275, 253, 17825, 278, 3591, 68, 6239, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 25184, 5700, 323, 10546, 7839, 757, 1114, 442, 1113, 4213, 288, 17475, 253, 4081, 5933, 5556, 4219, 247, 7321, 39762, 8103, 689, 253, 246, 3213, 3268, 273, 271, 288, 17475, 5931, 253, 4081, 6974, 310, 6760, 21657, 50276, 455, 273, 253, 30628, 5821, 326, 436, 310, 271, 1774, 1895, 285, 326, 253, 4081, 3082, 310, 12532, 19235, 30628, 574, 33196, 670, 253, 16774, 7103, 285, 253, 
10527, 3607, 273, 253, 6974, 984, 253, 7103, 273, 253, 6974, 310, 8558, 16774, 891, 2550, 5583, 14924, 273, 253, 2929, 275, 697, 1655, 830, 50276, 74, 5194, 342, 253, 1563, 2173, 37317, 7350, 253, 4081, 1332, 1057, 417, 1705, 342, 667, 1798, 23632, 285, 3782, 642, 23632, 5001, 253, 1055, 273, 18752, 253, 15579, 1307, 285, 970, 271, 256, 661, 69, 3733, 6974, 281, 23514, 1223, 23632, 403, 417, 3309, 323, 9311, 253, 2929, 943, 1056, 598, 323, 436, 342, 11088, 285, 21414, 4679, 891, 5194, 342, 391, 18, 326, 625, 10182, 28913, 2175, 327, 20953, 3210, 403, 3058, 604, 2717, 2010, 281, 10313, 253, 20544, 285, 32213, 273, 253, 4081, 2746, 891, 651, 671, 5583, 247, 625, 10182, 5955, 670, 253, 15180, 2105, 273, 436, 1332, 285, 849, 352, 476, 320, 9648, 2429, 281, 1666, 25379, 891, 13414, 5194, 326, 21547, 8138, 1020, 4679, 10313, 1199, 3340, 604, 3515, 625, 15958, 4679, 11355, 253, 4103, 3486, 273, 253, 4081, 1332 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

Review 1

Summary: The paper focuses on gradient projection (GP) for incremental learning. The authors motivate their work by noting that while GP-based approaches achieve superior performance in overcoming catastrophic forgetting, they suffer from a significant drawback. In GP, once the subspaces spanned by the layer-wise inputs of a task (say task A) have been computed, the network is only allowed to update its weights in directions orthogonal to these subspaces when learning a subsequent task (say task B); this keeps the parameters important for task A intact and thereby overcomes catastrophic forgetting. However, suppose task B is similar to task A; an extreme case is when task B is a continuation of task A. The parameters essential for task A are then likely to be important for task B as well, and the network could benefit from continuing to update them, which GP algorithms do not allow. This behavior has two consequences: (1) intransigence, i.e., the network cannot learn task B as effectively as possible, and (2) reduced backward transfer. The backward-transfer issue is apparent in the extreme case where task B is a continuation of task A: performance on task A would have improved had the network been able to learn task B using the weights important for task A. The paper addresses this issue with a simple yet practical solution. For a new task, the authors compute, layer-wise, the correlation between the subspaces of the old tasks and of the new task, keep track of the most correlated previous tasks for each layer, and call these the layer-wise trust regions. They then propose a scaled weight projection that unfreezes the important parameters in the trust region of the new task while it is being learned; the corresponding scaling weights are learned as part of the optimization. Finally, the authors report results on Permuted MNIST (PMNIST), Split CIFAR-100, CIFAR-100 Sup, and a sequence of five 10-class classification datasets (CIFAR-10, MNIST, SVHN, notMNIST, and Fashion-MNIST), in comparison with GP methods, regularization-based methods, and memory-replay methods, and show consistent improvements in accuracy and, more interestingly, in backward transfer.

Strengths: The paper addresses an important problem with a good solution, is well motivated, is theoretically sound, is well written and easy to follow, and does a good job of concisely reviewing the important recent work on the topic.

Weaknesses: I do not see any major weaknesses in this paper. I have a few questions that I would like the authors to address (see below). A few points the paper could improve on: providing the memory footprint of the algorithm, and comparing wall-clock times of TRGP and GP.

Questions for the authors:
1. The scaling matrix Q plays a crucial role in the approach, since it balances the freezing/unfreezing process. Why wouldn't the network always choose Q so that all parameters in the trust region are unfrozen? That would presumably give the smallest loss on the current task, since it provides the maximum capacity. If so, why optimize Q at all, as opposed to simply fixing it to unfreeze all parameters in the trust region?
2. Could you please comment on the memory footprint of your algorithm, and provide a head-to-head wall-clock comparison between GP and TRGP?
3. The last question is also about unfreezing and the matrix Q. If my understanding is correct, all tasks in a layer's trust region are currently treated as equally important. For instance, if task A and task C have a correlation of 1.0 (identical tasks) and task B and task C have a correlation of 0.8, both A and B are added to the trust region of task C, but the information that task A is more important to unfreeze is missing and is left to the optimization over Q. Do you think there is a benefit in incorporating the correlation values into the trust region?

Overall evaluation: I think the paper addresses a very important problem. Generally speaking, non-memory-replay-based methods for overcoming catastrophic forgetting can suffer from intransigence, i.e., the inability to learn new tasks due to the increased stiffness/rigidity of the network; this reduces both forward and backward transfer, especially when tasks are similar (e.g., when revisiting an old task). This paper provides a rational solution to this problem and delivers consistent numerical improvements over the state of the art. From an editorial point of view, the paper is well written, easy to follow, and gives a good overview of the recent literature on the topic. Hence I think this is a good paper and vote for its acceptance.
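To make the mechanism summarized in Review 1 concrete, here is a minimal NumPy sketch of the gradient-projection constraint that the paper relaxes. This is an illustration only, not the authors' code: the function names, the energy threshold, and the layer shapes are assumptions made for the example.

```python
import numpy as np

def orthonormal_input_basis(reps: np.ndarray, energy: float = 0.99) -> np.ndarray:
    """Orthonormal basis of the subspace spanned by a layer's inputs on an old task.

    reps: (n_features, n_samples) matrix of layer inputs collected on that task.
    Keeps the leading singular directions capturing `energy` of the spectrum.
    """
    U, S, _ = np.linalg.svd(reps, full_matrices=False)
    k = int(np.searchsorted(np.cumsum(S**2) / np.sum(S**2), energy)) + 1
    return U[:, :k]                      # (n_features, k)

def project_out_old_tasks(grad: np.ndarray, bases: list[np.ndarray]) -> np.ndarray:
    """GP-style constrained update for one layer.

    grad: (n_out, n_in) gradient of the layer's weight matrix.
    Removes the components of the gradient lying in old-task input subspaces,
    so a gradient step cannot change the layer's response to old-task inputs.
    """
    for B in bases:                      # each B: (n_in, k), orthonormal columns
        grad = grad - grad @ B @ B.T
    return grad
```

The review's point is that this constraint also freezes directions a strongly related new task would benefit from updating, which is what the trust region and the scaled weight projection are meant to relax.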
Review 2

Summary: Some existing methods place restrictive constraints on the optimization space of the new task to prevent catastrophic forgetting, which may lead to unsatisfactory performance on new tasks. This paper aims to facilitate forward knowledge transfer based on an efficient characterization of task correlation. The main contributions can be summarized as follows: (1) it introduces a novel notion of trust region, based on the norm of the gradient projection onto the subspace spanned by task inputs, to measure task correlation; (2) it proposes a novel approach that lets the new task leverage the knowledge of the strongly correlated old tasks in the trust region through a scaled weight projection; (3) it develops a continual learning approach, Trust Region Gradient Projection (TRGP), based on the introduced trust region, the scaled weight projection, and a module to construct the task input subspaces; (4) compared to related state-of-the-art approaches, TRGP achieves substantial performance improvements on all benchmarks.

Strengths: (1) The paper lays out the problems of existing methods and provides a detailed discussion. (2) It proposes a novel solution to this problem that achieves substantial performance improvements on all benchmarks and clearly outperforms its baseline method.

Weaknesses:
1. Toy example 1 does not seem convincing. In this example the only difference between task 1 and task 2 is the sign of the input. Since each task has a separate classifier, consider a simple scenario in which the network has only one linear layer followed by a classifier. With s2 = -s1 and the shared weights unchanged after learning task 2, even though w1 x1 and w2 x2 differ, we can adapt the classifier of task 2, which is task-specific, to ensure wc2 = -wc1, so that task 2 is still classified correctly. The example seems intended to show that the ideal output for task 2 should be the same as for task 1, i.e., w1 x1 = w2 x2; but since the input has changed, why can the output not change?
2. The way the correlation between tasks is measured. Denote by x (an m-by-n matrix, where n is the input dimension and m is the label dimension) the gradient spanned by the input batch of task t. Its projection onto the subspace of task j can be written as x Bj Bj^T. Performing an SVD, x = U S V^T, the gradient projection can be rewritten as U S V^T Bj Bj^T. Why use the projection of a vector spanned by U to measure the correlation, instead of the basis U itself?
3. As stated in the paper, the trust region is meant to select strongly correlated tasks, but in the section "impact of selected tasks in trust region" it is apparent that the trust region simply selects the closest tasks even when they are not strongly correlated. This can be seen in the task-wise experiment and in the higher layers of the layer-wise experiments, which makes me doubt the effectiveness of the trust-region selection.
4. In Section 4.2 the paper says that if task t is strongly correlated with task j, then the weight projection onto the subspace Sj is important for task t. This reads like an assumption rather than something that is proved; more convincing justification would be welcome.

Overall: The paper lays out the main problem of existing methods and provides a detailed discussion, and the proposed solution achieves substantial performance improvements on all benchmarks and clearly outperforms its baseline method. However, some of the claims do not seem convincing and some lack illustration.
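Reviewer 2's second point, about how task correlation is measured, can be made concrete with a small sketch. The threshold, the top-k cap, and the use of a per-layer gradient (or representation) matrix are assumptions here, not the paper's exact procedure:

```python
import numpy as np

def correlation_score(x_new: np.ndarray, B_old: np.ndarray) -> float:
    """How much of the new task's matrix lies in an old task's input subspace.

    x_new: (m, n) gradient or representation matrix for the new task at one layer.
    B_old: (n, k) orthonormal basis of the old task's input subspace at that layer.
    Returns ||x_new B B^T||_F / ||x_new||_F, a value in [0, 1].
    """
    proj = x_new @ B_old @ B_old.T
    return float(np.linalg.norm(proj) / (np.linalg.norm(x_new) + 1e-12))

def trust_region(x_new, old_bases, threshold=0.5, top_k=2):
    """Select the old tasks most correlated with the new one at this layer."""
    scores = [(j, correlation_score(x_new, B)) for j, B in old_bases.items()]
    scores.sort(key=lambda item: item[1], reverse=True)
    return [j for j, s in scores[:top_k] if s >= threshold]
```

Written this way, the score compares the Frobenius norm of the projected matrix against the original, which is exactly what the reviewer questions: the criterion looks at the projection of the gradient rather than directly comparing the bases of the two subspaces.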
Review 3

Summary: This paper explores the problem of forward knowledge transfer in continual learning settings. The idea is to measure correlations between the learned tasks based on the notion of a trust region, which helps identify the learned tasks most similar to the current one. The core idea is that weights frozen for similar past tasks can be relaxed so that they can be reused to learn the current task; since task similarities are used for this purpose, this does not lead to catastrophic forgetting and at the same time helps to transfer knowledge. Experiments on four benchmarks are provided to demonstrate that the method is effective.

Strengths: (1) Forward knowledge transfer is an under-explored problem in the continual learning literature. (2) The paper reads well. (3) The experiments are somewhat convincing.

Weaknesses: (1) The experiments are not extensive enough, and important benchmarks are missing. (2) A thorough comparison with recent works is missing.

Current work in continual learning focuses mostly on tackling catastrophic forgetting and overlooks accumulative learning, so I think the authors have selected a good area, and the proposed idea also seems sound. However, I have some reservations about this work:
1. Given the expectations in the recent continual learning literature, only relatively simple datasets are considered in the experiments. I think results on more complex datasets, such as Split sub-ImageNet and Split ImageNet, should be added.
2. I was wondering why the forward transfer metric is not used for comparison; I think it is as informative as the backward transfer metric.
3. An area of interest is to analyze the learning curves and the dynamics of learning, i.e., performance vs. training epoch. It would be helpful to see whether the proposed method enables better jump-start performance when a new task is learned.
4. Comparisons with more works are missing. For example, EWC and HAT are both outdated representatives of regularization-based methods, and many recent follow-ups exist. For a realistic comparison against prior work, state-of-the-art methods from each group of methods should be included; please check the recent literature.
5. Computational complexity is overlooked, whereas it is an important factor for continual learning. I think it is necessary to report how much additional computation is required to benefit from TRGP.
6. For a more informative comparison, the standard deviation of the results should be reported. Also, the BWT metric can be reported more accurately by including more decimals.

In conclusion, I think this work is going in a good direction, but further improvement is necessary to make it suitable for a venue similar to ICLR.
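For reference on points 2 and 6 above, the backward and forward transfer metrics the reviewer mentions are usually the GEM-style definitions of Lopez-Paz & Ranzato (2017); whether the paper uses exactly these is an assumption here. With T tasks, R_{i,j} the test accuracy on task j after training on task i, and \bar{b}_j the accuracy of a randomly initialized model on task j:

\[
\mathrm{BWT} = \frac{1}{T-1}\sum_{i=1}^{T-1}\left(R_{T,i} - R_{i,i}\right),
\qquad
\mathrm{FWT} = \frac{1}{T-1}\sum_{j=2}^{T}\left(R_{j-1,j} - \bar{b}_j\right).
\]

Both are averages over tasks, so reporting them with only one or two decimals can hide meaningful differences, which is the reviewer's point 6.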
Review 4

Summary: This paper proposes a continual learning method based on Gradient Projection Memory (GPM) of Saha et al., which projects the gradient of each layer so that it is orthogonal to the input subspace of previous tasks. Motivated by the fact that the orthogonal projection can harm performance by being too restrictive, the authors propose a heuristic algorithm to reduce the restriction: they choose a subset of the most correlated tasks and let the model change along the subspaces of those correlated tasks.

Strengths: The paper points out a major issue of GPM (Saha et al.), namely that the orthogonal projection is too restrictive, and the proposed heuristic empirically improves performance.

Weaknesses:
1. The writing is hard to follow. Since the paper introduces multiple new concepts, it was hard for me to understand the overall algorithm and the intuition behind it; I think the writing can be improved a lot. For example, it was hard to figure out what the authors are trying to do with the trust region: I was confused about whether they intend to allow or to restrict model updates in the trust region. In Figure 2, S_l seems to be the 2-D input subspace of layer l in a 3-D input space, but why does the gradient ∇_{W_l} L_t appear there, when according to Eq. 2 it should lie in the parameter space? Moreover, ∇_{W_l} L_t and its projection are matrices, so why are they drawn as vectors?
2. Some of the reasoning steps are controversial. Toy example 1 is not a good example. The authors claim that orthogonal projection is problematic because the optimal model for task 2 should be w2_l = -w1_l; however, that would lead to complete forgetting of the first task. In fact, this example is impossible to solve with a single linear layer, so I do not agree with the conjecture motivated by it, namely that naive orthogonal projection could compromise the learning performance of a new task that is strongly correlated with old tasks, especially when the correlation is negative as in toy example 1. Toy example 1 is a situation where the model has to choose between two conflicting options: (i) perfectly memorize task 1 and ignore task 2, or (ii) completely forget task 1 and learn task 2. Orthogonal projection is a method for option (i), which is one of the best things a linear model can do. Considering these arguments, the claim that the weight projection onto the subspace of the trust region should be modified is somewhat arbitrary; I think more theoretical justification is needed.
3. Lack of justification and empirical evidence for the scaled weight projection. Although the authors claim the scaled weight projection as one of their main contributions, there is no explanation of how the scaling matrix Q in Eq. 6 is initialized and trained, and there is no ablation study to prove its effectiveness. Without further information, I cannot acknowledge this component as a contribution.
4. Limited novelty. In the end, the authors perform a relaxed orthogonal projection of the gradient. The proposed method is just one way of relaxing the orthogonal projection, and there is not enough justification for why this particular algorithm should be effective. There can be various ways to relax the orthogonal projection; it would be better if the authors could provide a thorough empirical analysis of multiple relaxation schemes, or a theoretical justification for the proposed method.

This paper proposes a heuristic to improve GPM (Saha et al.); however, the proposed method lacks theoretical justification, useful insights, and some essential experiments, and the overall writing should be improved. Therefore, I do not think this paper meets the ICLR standards.
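Reviewer 4's reading of toy example 1 can be checked in a few lines. The setup below assumes the toy example is "task 2's inputs are the negated task 1 inputs with unchanged targets", which is a reconstruction for illustration rather than the paper's exact statement:

```python
import numpy as np

rng = np.random.default_rng(0)
X1 = rng.normal(size=(8, 3))      # task 1 inputs
w1 = rng.normal(size=3)
y = X1 @ w1                       # task 1 targets: fit exactly by w1

X2 = -X1                          # task 2: same targets, sign-flipped inputs
w2 = -w1                          # the only exact solution for task 2

print(np.allclose(X2 @ w2, y))    # True  -> task 2 is solved by w2 = -w1
print(np.allclose(X1 @ w2, y))    # False -> task 1 outputs are now -y: fully forgotten
```

A single shared linear layer cannot satisfy both tasks at once, which is the basis of the reviewer's objection that w2 = -w1 is not obviously the "ideal" solution; as Reviewer 2 notes, a task-specific classifier head could instead absorb the sign flip while the shared weights stay frozen.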
### Summary:
The submission addresses the problem of whether or not to update weights learned for a previous task in continual learning. The approach is to specify a trust region based on task similarity and to update weights only in the directions of the tasks that are similar enough to the current one. The paper was on balance well received: 3/4 reviewers recommended acceptance, 2 with scores of 8, and complimented it for its simple but effective approach and good discussion of the related literature. The submission attracted a reasonable amount of engagement and discussion between reviewers and authors, which should be taken into account in the final version of the paper.
Below is a given review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper presents a polar coordinate system-based data augmentation approach polarmix for scanning lidar point cloud understanding polarmix generates augmented data by mixing two different scans with scene-swapping and instance rotate-pasting extensive experiments and ablation studies validate the performance gain by polarmix for three tasks semantic segmentation object detection and domain adaptation on three datasets strengths the paper is well-written and easy to follow the augmentation strategy is beneficial not only for semantic segmentation tasks but also for object detection and domain adaptation i like the idea of mixing two domains data scan-wise to bridge the domain gap since polarmix is a da strategy generally it is model-agnostic and applicable to any model weaknesses the baselines for comparison with and without polarmix are not the latest ones i am wondering how large the performance gain by the da of polarmix is under some latest backbones with higher baseline performance will the improvement still be significant the contribution is a little limited considering the relevant work polarmix is kind of a combination of cutmix copy-paste and mix3d from the perspective of the idea at least one relevant work is not mentioned and compared for the object detection task [1] [1] j fang x zuo d zhou s jin s wang and l zhang lidar-aug a general rendering-based augmentation framework for 3d object detection cvpr 2021 the authors did not discuss the limitations of their research and potential negative societal impacts
docsep this paper presents a data augmentation method specifically designed for lidar point clouds two layers of data augmentation are introduced scene-level augmentation and instance-level augmentation the authors demonstrated the enhancement brought by this data augmentation method on various applications and showed that the proposed data augmentation is superior to the state-of-the-art lidar data augmentation methods the idea is simple in a good way the evaluation is thorough the authors demonstrated three applications namely semantic segmentation object detection and domain gap reduction the proposed method outperforms the state-of-the-art lidar data augmentation methods this paper presented a neat idea and showed extensive experiments to justify its contribution i do not see any strong limitations
docsep this paper proposes an augmentation method for point clouds especially those captured in road environments the proposed augmentation method is to crop cut and mix two 3d scans at both the scene level and instance level this concept can be extended to unsupervised domain adaptation uda by fusing source domain point clouds with known labels and target domain point clouds with unknown labels a series of experiments demonstrates its superiority in comparison with other point cloud augmentation methods 1 strength simple and straightforward ways of augmenting point clouds potential extension to uda tasks consistent performance improvements in various tasks 2 weakness lack of analysis and intuition behind such a design overall the authors present extensive experiments to demonstrate the superiority of the proposed data augmentation scheme however i wonder why such data augmentation is helpful for point-based recognition tasks for instance is polarmix effective in making the networks more robust to point cloud density or point cloud noise the lack of such analysis makes me suspicious of the novelty of this work currently this is the dominant worrying point of this paper 1 while the authors only focus on the lidar scans that are captured in road environments there are other types of point cloud datasets such as s3dis [i] or scannet [j] in my understanding this method is not applicable to the indoor point set datasets if so the authors should have included this problem as their limitation references [i] 3d semantic parsing of large-scale indoor spaces iccv 2016 [j] scannet richly-annotated 3d reconstructions of indoor scenes cvpr 2017
docsep this paper introduces an approach to augmenting cylindrical lidar point clouds to acquire boosted performance on 3d semantic segmentation and 3d detection the proposed approach called polarmix enables cutting editing and mixing point clouds along the scanning direction the augmentation happens at the scene level and instance level so that the augmented data provides a variety of combinations of the augmented scenes the proposed approach is superior to conventional global rotation and scale augmentation cutmix [37] copy-paste [13] and mix3d [21] strengths 1 the paper comprehensively summarizes the related work in the point cloud augmentation field the paper is self-contained so the readers readily follow the problem and the recent advances 2 the paper is straightforward to understand i enjoyed reading the paper the idea of mixing the 3d scenes is already known but mixing in polar coordinates seems to be very effective on the cylindrical lidar point clouds 3 the approach shows compelling results on the semantickitti [1] nuscenes-lidarseg [2] semanticposs [22] and synlidar datasets the proposed augmentation approach is applied with recent 3d semantic segmentation networks such as minknet [8] spvcnn [30] randlanet [15] and cylinder3d [41] for the task of 3d detection pointpillar [18] second [36] and centernet [10] are applied the gain is clear and it outperforms other baselines 4 the approach shows the effectiveness of the unsupervised domain adaptation as well since the approach can mix labeled source data and unlabeled target data the approach can be readily applied to the various combinations of the domains 5 the proposed augmentation approach improves data efficiency as demonstrated in the experiment section with a small amount of 3d scans polarmix can produce a similar performance 6 the paper explains the proposed idea in detail the supportive figures such as figures 1 and 2 help to understand the approach better weakness 1 the approach is a simple extension of the idea of the mix of 3d scenes and rotating bounding boxes limited to the azimuth angles the idea is not entirely new and the target domain is limited to the cylindrical lidar datasets not the general 3d scenes however the cylindrical lidar domain is one of the exciting domains for the task of intelligent mobility systems and the custom design of 3d detection for the cylindrical lidar data also forms a research field therefore i think it is not a critical weakness 2 it is unclear how the baseline approaches for the semantic segmentation are selected in addition to the cutmix [37] copy-paste [13] and mix3d [21] there are possible options to be applied similarly approaches to augment the 3d detection task such as gt-aug [36] [12] cutmix or the approaches of [6] [11] [13] could be applied the paper needs some clarification on how the baseline approaches were selected 3 it is recommended to indicate the computational overhead when the proposed approach is applied for instance when training minknet compared with the vanilla minknet training what percentage of the total time is added to use polarmix depending on the additional computation burden the baseline approaches could be re-evaluated the paper does not address the limitation of the proposed approach the concern about the computational overhead should be clarified and stated if it has drawbacks in any aspect the limitations section needs to describe them
docsep this paper proposed a data augmentation method named polarmix for lidar point cloud perception it includes two separate operations a scene-level swapping that first cuts point cloud sectors of two lidar scans wrt the point azimuth values and then switches the cut sectors to form a new sample for training and an instance-level copy-paste which selects instance points for certain classes from one scan rotates them along the lidar scanning direction and pastes the rotated points into another scan experimental results show that polarmix yields improvements for both lidar semantic segmentation and 3d object detection strengths 1 this paper introduces a new data augmentation method for lidar point clouds 2 the proposed polarmix is tested for both lidar semantic segmentation and 3d object detection and the results show that the proposed method can improve the baselines for both tasks weaknesses 1 this paper explains mostly the what and how but not much of the why for their specific augmentation operations more design insights and analysis can improve this paper a lot 2 the experimental analysis is insufficient to support all conclusions stated in this paper 3 the elaboration of this paper is not good enough some assumptions are made without clear logical relations this paper does not include a specific description of potential limitations in l396 the authors state that the effectiveness of the proposed method might be influenced by the parameter setting which should be regarded as a limitation and included in the ablation analysis more general limitations are recommended to be discussed such as the applicability of the proposed method for different types of lidar point clouds and networks ### Summary:
the proposed augmentation method for lidar scans is to crop cut and mix two 3d scans at both the scene level and instance level the approach is not novel and is a simple extension of the idea of the mix of 3d scenes and rotating bounding boxes another limitation is that the method cannot be applied to general 3d scenes the reviews include an accept (7) a weak accept (6) a borderline accept (5) and two borderline rejects (4) after carefully checking out the rebuttals and discussions i recommend the paper to be presented for the neurips community
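to make the augmentation operations discussed in these reviews concrete, a minimal numpy sketch of scene-level sector swapping and instance-level rotate-pasting for lidar scans is given below; the function names, the azimuth window and the rotation angles are illustrative assumptions for this summary and not the authors' reference implementation

import numpy as np

def scene_swap(scan_a, scan_b, azimuth_range=(0.0, np.pi / 2)):
    # cut the points of an azimuth sector out of scan_a and replace them with
    # the points of the same sector taken from scan_b (scene-level mixing)
    def in_sector(scan):
        yaw = np.arctan2(scan[:, 1], scan[:, 0])
        return (yaw >= azimuth_range[0]) & (yaw < azimuth_range[1])
    kept = scan_a[~in_sector(scan_a)]
    pasted = scan_b[in_sector(scan_b)]
    return np.concatenate([kept, pasted], axis=0)

def rotate_paste(instance_points, target_scan, angles=(np.pi / 2, np.pi, 3 * np.pi / 2)):
    # paste several rotated copies of selected instance points (e.g. cars or
    # pedestrians) into the target scan; rotation is around the lidar z axis only
    copies = [target_scan]
    for a in angles:
        rot = np.array([[np.cos(a), -np.sin(a), 0.0],
                        [np.sin(a),  np.cos(a), 0.0],
                        [0.0,        0.0,       1.0]])
        pts = instance_points.copy()
        pts[:, :3] = pts[:, :3] @ rot.T
        copies.append(pts)
    return np.concatenate(copies, axis=0)

# usage on two (n, 4) scans of x, y, z, intensity: swap a sector, then paste
# rotated copies of a handful of instance points into the mixed scan
scan_a, scan_b = np.random.randn(1000, 4), np.random.randn(1000, 4)
mixed = rotate_paste(scan_a[:50], scene_swap(scan_a, scan_b))

reading the sketch this way also makes the reviewers' domain adaptation remark easy to see: nothing in the sector swap requires the two scans to carry labels, so one scan can come from a labeled source domain and the other from an unlabeled target domain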
Below is a given review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes a new attention mechanism in vision transformers which is called quadtree attention this quadtree attention mechanism builds token pyramids and computes the attention in a coarse-to-fine manner at each level only the top k regions with the highest attention scores from query and key are selected for further attention computation on finer tokens while other regions simply use the current coarse attention as a part of the output in the empirical study the proposed quadtree attention achieves good performance on a wide range of vision tasks eg feature matching stereo matching classification and object detection with less computation pros 1 the multiscale design in quadtree attention is reasonable the quadtree attention can model long-range interactions using coarse tokens from less relevant regions while establishing dense interactions using fine tokens from informative local regions thus the multiscale structure in a coarse-to-fine manner in quadtree attention provides an efficient way to capture both long-range and local interactions a further modified version of quadtree attention quadtree-b introduces additional weights and overlapping regions reducing the effect from inaccurate attention scores on coarse tokens 2 the quadtree attention can be used in both self-attention and cross-attention which leads to a wider applicability 3 empirical performance demonstrates the effectiveness of the proposed approach on feature matching tasks the proposed quadtree achieves significant improvement on the auc of camera pose errors on stereo matching and object detection it is observed that the computation flops are reduced by a large margin with on-par performance 4 the paper is well written and easy to follow cons questions suggestions 1 from the technical contribution aspect the quadtree structure is not brand new it is arguably true that the proposed approach is one of the earliest attempts to introduce the quadtree structure into the attention mechanism however since some recent work [1] also employs a coarse-to-fine structure in attention it would be better to have some comparison with such work [1] focal self-attention for local-global interactions in vision transformers neurips 2021 2 the design of pyramid tokens is unclear 1 in figure 1 top-2 patches are used to compute finer attention i wonder why in level 2 there are two subpatches green and yellow from the position of the red patch in level 1 we should only have the green patch in level 2 which could be consistent with figure 2 in addition why do the two green patches in level 2 of image b have the same location in the 2x2 grid both are lower left the same pattern is also observed in red patches in level 3 2 it remains unclear how the pyramid tokens are generated on pages 4-5 it mentions we construct l-level pyramids for query and value v tokens respectively by 2x2 average pooling does it mean the coarse tokens are always average pooled fine tokens at one level lower is there any additional patch embedding layer introduced 3 in level 1 top k patches are selected which leads to 4k subpatches in level 2 does it uniformly sample the top k subpatches from the 4k subpatches or must each 2x2 group of subpatches from a coarse patch have one sampled subpatch 3 in the empirical experiments the quadtree attention does not bring much efficiency improvement on the imagenet classification it has very similar parameters and flops compared with its baseline pvtv2 which is quite different from the object detection task i wonder what is the reason behind such a big gap 4 typos equation 1 should be v instead of vt figure 2 it would be better to use mathbf mi1 instead of mi1 for a consistent notation in the text in the line below eq 4 should sij1 be sij0 in summary i think the proposed quadtree attention is a reasonable and efficient design of attention mechanism it builds coarse and fine tokens which is able to model both long-range and local interactions empirical results also demonstrate that the proposed approach achieves good performance with less computation on many vision tasks however i still hold some concerns in technical contribution the unclear design of the pyramid tokens and the performance on the classification task thus i would like to rate this paper as marginally above the acceptance threshold
docsep the paper proposed an attention approach to handle the global or long-range attention by leveraging the idea of a quadtree structure meanwhile reducing the quadratic complexity of the original attention operation to linear experiments are performed on several tasks eg feature matching image classification and object detection superior results are achieved on these tasks strengths the idea of the paper is clear and easy to follow the effectiveness of the proposed approach is verified on different tasks weaknesses some important related works are missing the previous work [1] also proposed a multilevel attention approach to efficiently balance the short-distance and long-distance attention which proposed the same way to generate the multilevel or fine and coarse tokens from this point the idea of quadtree-b attention is very similar to [1] however the authors did not mention and discuss the relation to this very related work i suggest the authors carefully discuss the relation to [1] the paper proposed an efficient way to process long-distance and short-distance attention there should be some baselines we should compare with 1 the vanilla attention mechanisms 2 some other efficient attention mechanisms eg shifted windows attention in swin transformer focal attention in focal transformer [1] sampled kv in pvt however the authors only compared with the pvt i would like to see more solid apple-to-apple comparisons and discussions on that questions how about the real runtime when applying quadtree attention to the pvt architecture as i posted above i think the quadtree attention mechanism is a general attention mechanism i would like to know how about the performance when applying it to other transformer-based architectures eg swin transformer vit et al the authors argue that swin transformer restricts the attention in the local windows however by shifting the windows the swin transformer enables information exchange between windows from this point i do not think that swin transformer limits the attention within local windows [1] focal self-attention for local-global interactions in vision transformers the paper proposed a clear method to handle long-distance and short-distance attention however important related work is missing in the discussion as mentioned above i think more discussions and experiments should be added to support the claim i am leaning to reject the paper if the authors cannot address my concerns
docsep this paper aims to address the quadratic computational complexity of vanilla transformers the key idea is to build token pyramids and compute attention in a coarse-to-fine manner which reduces the computational complexity to linear the resulting attention paradigm forms a quadtree structure and two variants quadtree-a and quadtree-b are further proposed to improve message aggregation experiments are conducted on four different computer vision tasks involving either self-attention or cross-attention results show that the proposed approach matches the state of the art with much fewer flops and model parameters pros 1 this paper is easy to follow and overall well-structured 2 i think the problem this paper tries to tackle is interesting for the community designing an efficient attention mechanism in transformers for vision tasks is definitely an important problem especially computing attention within token pyramids following a quadtree structure is novel to me which has not been studied in previous work 3 the experiments are comprehensive and strong it is good to see the comparisons on four different computer vision tasks which cover both self-attention and cross-attention although the improvement in accuracy seems minor to me the reduction in computation is notable cons overall i have some concerns about the experiments and the following points might be worth studying in the rebuttal period 1 sparse attention usually leads to a faster convergence rate but higher final losses compared to full attention i wonder if this still holds in quadtree attention it is interesting to see the training curve of the proposed method compared with vit or swin 2 i wonder if the number of tokens in the finest resolution will affect the model performance eg what is the current patch size of the input image and will it lead to performance improvement if a larger or smaller patch size is chosen 3 it is also worth exploring if variants of larger or smaller size in model parameters will lead to performance improvement and how flops change in these variants the paper studied a challenging and important problem in efficient attention mechanisms i think the overall method is novel and the experimental results are promising which leads me to a positive rating i hope the authors can further address my concerns in the rebuttal
docsep the paper proposes an efficient attention algorithm based on a quadtree for vision transformers it establishes a feature pyramid for tokens and aggregates sparse key-value features in each pyramid level the method achieves competitive performance on stereo matching image classification and object detection strength introducing the quadtree to vision transformers is interesting and makes sense it can efficiently achieve long-range dependence while keeping the local details the paper is overall well-written and well-organized the reported performance is competitive against many state-of-the-art vision transformers weakness top-k assignment for tokens in the quadtree is non-differentiable which may reduce the generalization how does the proposed method achieve end-to-end training the quadtree aggregates features in an unstructured and iterative way it is unfriendly to parallel devices and probably takes more latency than the dense attention it would be nice to provide the actual throughput on gpus to demonstrate the efficiency the paper proposes a novel efficient vision transformer based on quadtrees which is interesting and technically sound it achieves competitive performance in various vision tasks ### Summary:
the paper proposes an efficient attention variant inspired by quadtrees for use in vision transformers when applied to several vision tasks the approach leads to better results and/or less compute the reviews are all positive about the paper after taking into account the authors' feedback one reviewer apparently forgot to update their official rating the reviewers point out that the idea is reasonable and the empirical evaluation is thorough and convincing with good gains on several tasks and datasets overall i recommend acceptance
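because several reviewers ask how the coarse-to-fine computation behaves and why it stays cheap, a deliberately simplified single-head pytorch sketch of the top-k coarse-to-fine idea is added below; it is a toy 1-d two-level version written for clarity (fine queries score pooled keys, a per-query loop instead of a batched gather, no learned message weights or overlapping regions), so it should be read as an assumption-laden illustration of the mechanism rather than the paper's exact quadtree-b aggregation

import torch
import torch.nn.functional as F

def coarse_to_fine_attention(q, k, v, pool=2, topk=4):
    # q, k, v: (n, d) fine tokens; n must be divisible by pool and topk <= n // pool
    n, d = q.shape
    k_coarse = k.view(n // pool, pool, d).mean(dim=1)   # pooled "parent" keys
    v_coarse = v.view(n // pool, pool, d).mean(dim=1)
    scores_coarse = q @ k_coarse.t() / d ** 0.5          # (n, n // pool)
    out = torch.zeros_like(q)
    for i in range(n):                                   # per-query loop for readability
        top = scores_coarse[i].topk(topk).indices        # most relevant coarse regions
        children = (top[:, None] * pool + torch.arange(pool)).flatten()
        # dense fine attention only inside the children of the selected regions
        attn_fine = F.softmax(q[i] @ k[children].t() / d ** 0.5, dim=-1)
        msg_fine = attn_fine @ v[children]
        # every other region contributes only through its coarse summary token
        rest = torch.ones(n // pool, dtype=torch.bool)
        rest[top] = False
        attn_coarse = F.softmax(scores_coarse[i][rest], dim=-1)
        msg_coarse = attn_coarse @ v_coarse[rest]
        out[i] = msg_fine + msg_coarse
    return out

# example: 64 tokens of dimension 16
out = coarse_to_fine_attention(torch.randn(64, 16), torch.randn(64, 16), torch.randn(64, 16))

in the full method the same top-k expansion is applied recursively over a pyramid of levels, which is what brings the cost down to roughly linear in the number of tokens; this flat two-level toy still keeps a dense coarse score matrix and is only meant to show the selection-and-mixing step that the reviewers discuss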
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 247, 747, 4116, 5122, 275, 8113, 4979, 398, 534, 310, 1925, 9853, 12588, 4116, 436, 9853, 12588, 4116, 5122, 21168, 10669, 25874, 2352, 285, 48169, 253, 4116, 275, 247, 820, 1032, 292, 1171, 460, 5133, 387, 1016, 1268, 760, 1755, 465, 4811, 342, 253, 4585, 4116, 7363, 432, 7316, 285, 2234, 403, 4236, 323, 2007, 4116, 13782, 327, 40259, 21761, 1223, 643, 4811, 3365, 897, 1655, 25319, 4116, 347, 247, 629, 273, 3453, 275, 253, 16774, 1263, 253, 4081, 9853, 12588, 4116, 33526, 1175, 3045, 327, 247, 4618, 2491, 273, 8113, 8892, 24088, 4735, 11038, 36167, 11038, 9162, 285, 1789, 5481, 342, 1679, 13782, 5847, 337, 253, 1554, 2865, 1079, 2216, 275, 9853, 12588, 4116, 310, 5272, 253, 9853, 12588, 4116, 476, 1566, 1048, 6324, 6355, 970, 25319, 21761, 432, 1679, 4623, 4811, 1223, 14631, 14086, 6355, 970, 4030, 21761, 432, 27096, 1980, 4811, 3021, 253, 1554, 2865, 1079, 2605, 275, 247, 820, 1032, 292, 1171, 460, 5133, 275, 9853, 12588, 4116, 3400, 285, 5919, 1039, 281, 9232, 1097, 1048, 6324, 285, 1980, 6355, 247, 2007, 7321, 2715, 273, 9853, 12588, 4116, 9853, 12588, 67, 23970, 3081, 13461, 285, 21481, 4811, 8493, 253, 1055, 432, 31215, 4116, 4868, 327, 25319, 21761, 50276, 19, 253, 9853, 12588, 4116, 476, 320, 908, 275, 1097, 1881, 42959, 285, 2831, 42959, 534, 5644, 281, 247, 14200, 30437, 50276, 20, 16774, 3045, 14371, 253, 12510, 273, 253, 4081, 2746, 327, 4735, 11038, 8892, 253, 4081, 9853, 12588, 33526, 1534, 7756, 327, 247, 1028, 273, 6568, 16753, 6332, 327, 36167, 11038, 285, 1789, 5481, 352, 310, 2540, 326, 253, 13782, 892, 2695, 310, 3777, 407, 247, 1781, 8459, 342, 327, 1148, 3045, 50276, 21, 253, 2929, 310, 973, 3542, 285, 3477, 281, 956, 50275, 5040, 50276, 34974, 50276, 35640, 621, 337, 432, 253, 7681, 7680, 4809, 253, 9853, 12588, 2605, 310, 417, 7138, 747, 352, 310, 25711, 2032, 326, 253, 4081, 2746, 310, 581, 273, 253, 18353, 9437, 281, 9569, 9853, 12588, 2605, 715, 253, 4116, 5122, 2299, 1580, 690, 3332, 789, 337, 671, 27532, 820, 1032, 292, 1171, 460, 2605, 275, 4116, 352, 651, 320, 1805, 281, 452, 690, 5301, 342, 824, 789, 337, 18560, 1881, 42959, 323, 1980, 14456, 6355, 275, 8113, 4979, 398, 5723, 2824, 43425, 50276, 19, 253, 2216, 273, 39694, 21761, 310, 12744, 50274, 18, 275, 4677, 337, 1755, 19, 20412, 403, 908, 281, 11897, 40259, 4116, 891, 4282, 2139, 275, 1268, 374, 627, 403, 767, 749, 4066, 2706, 4759, 285, 8862, 432, 253, 1899, 273, 2502, 12097, 275, 1268, 359, 943, 760, 452, 253, 4759, 12097, 275, 1268, 374, 534, 812, 320, 5185, 342, 4677, 374, 275, 1635, 2139, 253, 767, 4759, 20412, 275, 1268, 374, 273, 2460, 270, 452, 253, 1072, 4328, 275, 253, 374, 89, 19, 9860, 1097, 403, 2406, 1669, 253, 1072, 3102, 310, 671, 2540, 275, 2502, 20412, 275, 1268, 495, 50274, 19, 352, 4558, 12744, 326, 849, 253, 39694, 21761, 403, 4561, 275, 3239, 5329, 352, 25957, 359, 3989, 298, 5251, 25874, 2352, 323, 7316, 285, 1318, 362, 21761, 2975, 407, 374, 89, 19, 3388, 45900, 1057, 352, 1599, 253, 25319, 21761, 403, 1900, 3388, 24462, 4030, 21761, 387, 581, 1268, 2406, 310, 627, 667, 3081, 12097, 21496, 3828, 5611, 50274, 20, 275, 1268, 337, 1755, 465, 20412, 403, 4236, 534, 5644, 281, 577, 76, 749, 4066, 2706, 275, 1268, 374, 1057, 352, 17568, 3410, 1755, 465, 749, 4066, 2706, 432, 577, 76, 749, 4066, 2706, 390, 1016, 374, 89, 19, 749, 4066, 2706, 432, 25319, 12097, 1364, 452, 581, 19958, 749, 20559, 50275, 20, 275, 253, 16774, 
4679, 253, 9853, 12588, 4116, 1057, 417, 3324, 1199, 6733, 7756, 327, 253, 4440, 257, 292, 9162, 352, 556, 1077, 2074, 3602, 285, 892, 2695, 2429, 342, 697, 8245, 268, 87, 18698, 19, 534, 310, 3240, 1027, 432, 253, 1789, 5481, 4836, 891, 4282, 752, 310, 253, 1921, 3212, 824, 347, 1943, 8037, 50276, 21, 963, 993, 50276, 29813, 337, 943, 320, 362, 3185, 273, 362, 85, 50276, 13206, 374, 352, 651, 320, 1805, 281, 897, 14168, 3342, 7373, 18, 3185, 273, 3641, 18, 323, 247, 5185, 14951, 275, 253, 2505, 50276, 1282, 2708, 16186, 577, 943, 256, 1944, 18, 320, 256, 1944, 17, 275, 6010, 891, 1158, 253, 4081, 9853, 12588, 4116, 310, 247, 5272, 5919, 2216, 273, 4116, 5122, 352, 21168, 25319, 285, 4030, 21761, 534, 310, 2104, 281, 1566, 1097, 1048, 6324, 285, 1980, 6355, 16774, 1543, 671, 7568, 326, 253, 4081, 2746, 33526, 1175, 3045, 342, 1679, 13782, 327, 1142, 8113, 8892, 2299, 891, 1335, 2186, 690, 7350, 275, 7681, 7680, 12744, 2216, 273, 253, 39694, 21761, 285, 3045, 327, 9162, 4836, 3021, 891, 651, 751, 281, 2281, 436, 2929, 347, 42876, 1840, 253, 14924, 7887, 5474, 339, 431, 248, 2929, 4081, 271, 4116, 2746, 281, 6016, 253, 4156, 390, 1048, 6324, 4116, 407, 19732, 2977, 253, 2934, 273, 9853, 12588, 2605, 26614, 8493, 253, 21396, 10454, 273, 3236, 4116, 4254, 281, 4872, 4679, 403, 2684, 327, 2067, 8892, 24088, 4735, 11038, 2460, 9162, 285, 14926, 5481, 8936, 1543, 403, 6786, 327, 841, 8892, 50275, 296, 3755, 20556, 50275, 783, 2934, 273, 253, 2929, 310, 2590, 285, 3477, 281, 956, 598, 50276, 783, 12510, 273, 253, 4081, 2746, 310, 16058, 327, 1027, 8892, 50274, 20881, 1255, 265, 50276, 8826, 1774, 2905, 2987, 403, 5816, 253, 2045, 789, 337, 671, 4081, 247, 1554, 48268, 4116, 2746, 281, 14556, 6654, 253, 2159, 19893, 285, 1048, 19893, 4116, 534, 4081, 253, 1072, 1039, 281, 6635, 253, 1554, 48268, 390, 4030, 285, 25319, 21761, 432, 436, 1127, 253, 2934, 273, 9853, 12588, 67, 4116, 310, 1077, 2074, 281, 337, 2299, 253, 4477, 858, 417, 3748, 285, 2319, 253, 5886, 281, 436, 1077, 2905, 789, 891, 1804, 253, 4477, 9257, 2319, 253, 5886, 337, 50274, 783, 2929, 4081, 271, 5919, 1039, 281, 1232, 1048, 19893, 285, 2159, 19893, 4116, 627, 943, 320, 690, 1666, 25379, 359, 943, 7277, 342, 337, 253, 26724, 4116, 6297, 374, 690, 643, 5919, 4116, 6297, 24088, 14728, 8323, 4116, 275, 1863, 565, 16147, 19946, 18560, 4116, 275, 18560, 39707, 18, 19958, 44739, 275, 268, 20282, 2299, 253, 4477, 760, 2429, 342, 253, 268, 20282, 2654, 751, 281, 923, 625, 4891, 19126, 936, 19934, 14023, 285, 11985, 327, 326, 50274, 34974, 50276, 5430, 670, 253, 1524, 20243, 672, 9433, 9853, 12588, 4116, 281, 268, 20282, 10336, 50275, 284, 891, 9269, 1840, 891, 1158, 253, 9853, 12588, 4116, 5122, 310, 247, 2087, 4116, 5122, 2654, 751, 281, 871, 849, 670, 253, 3045, 672, 9433, 281, 643, 39707, 3169, 10336, 24088, 1863, 565, 16147, 19946, 9084, 1162, 355, 50275, 783, 4477, 9059, 326, 1863, 565, 16147, 19946, 45798, 253, 4116, 275, 253, 1980, 8323, 2299, 407, 19507, 253, 8323, 275, 253, 1863, 249, 4979, 352, 13276, 326, 1491, 6431, 875, 8323, 323, 1863, 249, 432, 436, 1127, 891, 13414, 1158, 326, 1863, 565, 16147, 19946, 7787, 253, 4116, 364, 249, 1980, 8323, 50274, 18, 18560, 1881, 42959, 323, 1980, 14456, 6355, 275, 8113, 4979, 398, 50276, 783, 2929, 4081, 247, 2590, 1332, 281, 6016, 1048, 19893, 285, 2159, 19893, 4116, 2299, 1774, 2905, 789, 310, 5816, 275, 253, 5955, 347, 5393, 1840, 891, 1158, 625, 11985, 285, 4679, 943, 320, 2879, 281, 1329, 253, 1750, 891, 717, 25661, 281, 12009, 253, 2929, 604, 253, 4477, 16216, 2953, 619, 7350, 50276, 
7152, 33032, 2520, 2929, 13698, 281, 2953, 253, 21396, 15180, 10454, 273, 26724, 4979, 398, 253, 2234, 2934, 310, 281, 1973, 10669, 25874, 2352, 285, 48169, 4116, 275, 247, 820, 1032, 292, 1171, 460, 5133, 534, 11355, 253, 15180, 10454, 281, 4872, 253, 4795, 4116, 22199, 4948, 247, 9853, 12588, 2605, 285, 767, 11640, 9853, 12588, 66, 285, 9853, 12588, 67, 403, 2007, 4081, 281, 3157, 3935, 20828, 4679, 403, 5196, 327, 1740, 1027, 4382, 8113, 8892, 7668, 2057, 1881, 42959, 390, 2831, 4116, 1543, 921, 326, 253, 4081, 2746, 10129, 253, 1375, 273, 253, 1445, 342, 1199, 11184, 892, 2695, 285, 1566, 3602, 5847, 50275, 18, 436, 2929, 310, 3477, 281, 956, 285, 4583, 973, 34218, 50276, 19, 891, 1158, 253, 1895, 436, 2929, 14177, 281, 18915, 310, 4722, 323, 253, 3114, 20462, 271, 5919, 4116, 5122, 275, 4979, 398, 323, 8113, 8892, 310, 7964, 271, 1774, 1895, 3340, 12672, 4116, 1561, 10669, 25874, 2352, 1563, 247, 572, 472, 12588, 2605, 310, 4460, 281, 479, 534, 556, 417, 644, 5421, 275, 2045, 789, 50276, 20, 253, 4679, 403, 11088, 285, 2266, 352, 310, 1175, 281, 923, 253, 14023, 327, 1740, 1027, 4382, 8113, 8892, 534, 3835, 1097, 1881, 42959, 285, 2831, 42959, 3738, 253, 7756, 275, 7200, 3133, 5884, 281, 479, 253, 5141, 275, 13782, 310, 16613, 50276, 5040, 50276, 1189, 455, 891, 452, 690, 7350, 670, 253, 3368, 285, 253, 1563, 2792, 1537, 320, 18338, 281, 1263, 275, 253, 30080, 22559, 2180, 50276, 18, 18345, 4116, 3798, 5644, 281, 247, 7938, 14940, 2281, 533, 2169, 2457, 11655, 2429, 281, 2120, 4116, 891, 4282, 604, 436, 1335, 6556, 275, 9853, 12588, 4116, 352, 310, 4722, 281, 923, 253, 3733, 6970, 273, 253, 4081, 1332, 2429, 342, 9084, 390, 1863, 249, 50276, 19, 891, 4282, 604, 253, 1180, 273, 21761, 275, 253, 23476, 6064, 588, 2818, 253, 1566, 3045, 24088, 752, 310, 253, 1655, 12097, 1979, 273, 253, 3280, 2460, 285, 588, 352, 1421, 281, 3045, 7756, 604, 247, 4067, 390, 4577, 12097, 1979, 310, 6777, 50276, 20, 352, 310, 671, 18338, 281, 8338, 604, 11640, 273, 4067, 4577, 1979, 275, 1566, 3602, 588, 1421, 281, 3045, 7756, 285, 849, 892, 2695, 1818, 275, 841, 11640, 253, 2929, 5421, 247, 11132, 285, 1774, 1895, 275, 5919, 4116, 6297, 891, 1158, 253, 4583, 1332, 310, 4460, 285, 253, 5661, 1543, 403, 12532, 534, 5644, 479, 281, 247, 2762, 13716, 891, 3524, 253, 4477, 476, 2007, 2953, 619, 7350, 275, 253, 30080, 22559, 5474, 339, 431, 248, 2929, 29328, 271, 5919, 4116, 5933, 1754, 327, 9853, 12588, 323, 8113, 4979, 398, 352, 25097, 247, 4735, 39694, 323, 21761, 285, 29111, 23507, 2234, 2877, 3386, 275, 1016, 39694, 1268, 253, 1332, 33526, 12085, 3045, 327, 253, 36167, 2460, 9162, 285, 1789, 5481, 4757, 50276, 36445, 2844, 253, 9853, 12588, 281, 253, 8113, 4979, 398, 310, 4722, 285, 2789, 3282, 352, 476, 14556, 5115, 1048, 6324, 10096, 1223, 7562, 253, 1980, 4278, 50276, 783, 2929, 310, 4583, 973, 15720, 285, 973, 34092, 50276, 783, 2361, 3045, 310, 12085, 1411, 1142, 1375, 23037, 14387, 8113, 4979, 398, 50276, 20881, 1255, 50276, 3956, 76, 12714, 323, 21761, 275, 253, 9853, 12588, 310, 27370, 7413, 6051, 534, 778, 4796, 253, 26647, 849, 1057, 253, 4081, 1332, 5115, 990, 936, 423, 3733, 50276, 783, 9853, 12588, 29111, 3386, 275, 271, 440, 34218, 285, 34560, 1039, 352, 310, 5369, 6902, 314, 281, 253, 7529, 4095, 285, 3164, 3936, 625, 22667, 685, 253, 14086, 4116, 352, 651, 320, 5322, 281, 2085, 253, 4588, 28519, 327, 31025, 316, 281, 7568, 253, 6733, 253, 2929, 29328, 247, 4460, 5919, 8113, 39707, 1754, 327, 9853, 45670, 534, 310, 4722, 285, 7681, 3590, 352, 33526, 12085, 3045, 275, 2710, 8113, 8892, 2490, 
187, 4118, 18435, 27, 783, 2929, 29328, 271, 5919, 4116, 12955, 11797, 407, 9853, 45670, 323, 897, 275, 8113, 4979, 398, 672, 3732, 281, 2067, 8113, 8892, 253, 2746, 5644, 281, 1805, 1543, 285, 263, 1679, 11897, 50275, 783, 10123, 403, 512, 2762, 670, 253, 2929, 846, 3192, 715, 2395, 253, 4477, 8680, 581, 37317, 18298, 281, 5731, 616, 3565, 13716, 8505, 597, 1127, 562, 326, 253, 2934, 310, 5272, 285, 253, 16774, 7103, 310, 11080, 285, 21414, 342, 1175, 15988, 327, 2067, 8892, 285, 15302, 50276, 1189, 455, 891, 5583, 14924 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
The authors target an important data format, multivariate time series, and extend the use of the transformer to this format. The effort is appreciated, as multivariate time series data is an important problem while research on it remains limited compared to other sequence problems (e.g., language). Due to the simple idea of the extension and similar prior work [1], I expected a higher quality of experiments and presentation in this paper; however, the experiments and the writing need significant improvement for the next submission.

Major concerns:
1. The presentation of the experiments is confusing; I just picked a few examples from Table 1: (a) there is no standard deviation of the results; (b) some methods are consistently worse than others (e.g., 1NN-DTWD compared to 5NN-DTWD), and I am not sure why they are needed in the table; (c) averaging the RMSE does not deliver much information, especially when there is no normalization for each dataset; for example, BeijingPM10 will dominate the averaged result (a small numeric illustration follows this review).
2. The structure of the writing also has a lot of problems. For example, the fully supervised setting should usually be presented before the semi-supervised one, while the paper does it in the reverse order. Another problem is the classification result, which seems to be an important part but whose table is shown in the appendix.
3. The selection of datasets needs more justification.

Reference:
[1] https://arxiv.org/pdf/2001.08317.pdf
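To make the point about unnormalized averaging concrete, here is a minimal illustration; the numbers are hypothetical and chosen only to show the scale effect, and normalizing by the target standard deviation is just one plausible choice, not something taken from the paper.

```python
# Hypothetical per-dataset RMSEs on very different scales (illustrative values only).
rmse = {"BeijingPM10": 95.0, "AppliancesEnergy": 2.1, "LiveFuelMoisture": 40.0}
target_std = {"BeijingPM10": 90.0, "AppliancesEnergy": 3.0, "LiveFuelMoisture": 35.0}

plain_avg = sum(rmse.values()) / len(rmse)                           # dominated by the largest-scale dataset
norm_avg = sum(rmse[k] / target_std[k] for k in rmse) / len(rmse)    # each dataset contributes comparably

print(round(plain_avg, 2), round(norm_avg, 2))                       # 45.7 vs. roughly 0.97
```

With raw averaging, any change on the small-scale datasets is invisible in the reported mean, which is the reviewer's objection.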
docsep
This paper aims to develop a transformer-based pretrained model for multivariate time series representation learning. Specifically, only the transformer's encoder is used, and a time-series imputation task is constructed as the unsupervised learning objective. This is somewhat similar to the BERT model in NLP, but the authors add a mask for each variable of the time series. After pretraining with this imputation loss, the transformer can be used for downstream tasks such as regression and classification; as the authors mention on page 6, this is achieved by further finetuning all weights of the pretrained transformer (a small sketch of this masked-imputation objective follows this review).

It is natural to use the transformer model from NLP for time series modeling, since both sentences and time series are sequential data. In this work, the authors' contributions or changes include two points: (1) constructing the imputation task for multivariate time series data, and (2) using a learned positional encoding (page 4). I think these two things are somewhat interesting.

My concerns mainly include the following:
1. There are existing works that have tried to use the transformer for time series, but you did not compare against them in your experiments. At the very least, I think you should clarify your advantages compared to these existing works: "Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting" (NeurIPS 2019) and "SketchBERT: Learning Sketch Bidirectional Encoder Representation from Transformers by Self-Supervised Learning of Sketch Gestalt" (CVPR 2020).
2. As a core topic of this work, I encourage the authors to clarify the definition of the time series representation: what is a good representation of a time series? There is related work on time series representation learning, "Unsupervised Scalable Representation Learning for Multivariate Time Series" (NeurIPS 2019). I notice that they train a dilated CNN model to get a single feature vector for a segment of a time series, but in your case you directly concatenate the states from all time steps to form the final vector. I wonder whether this strategy is reasonable, because when we deal with long time series, concatenating them will create another long hidden-state sequence; this may not be representation learning, and it cannot handle segment-level tasks such as classification. Thus I encourage the authors to give more insight into the question: what is a good representation of a time series?
3. Besides, in the NeurIPS representation learning paper above, I find that they did not finetune their model's parameters; they just added an SVM to test the performance of the learned representations. In this work, however, the authors finetune their model. I therefore wonder what the performance of your model is without additional finetuning, and what your parameter settings for the finetuning procedure are.
4. Another question I am concerned about is the learnable positional encoding. It has a shape of W-by-d, where W is the window length and d is the input dimension. Since time series are dynamic data and their length can be much longer, this design may not be good, as it largely increases the number of parameters.
5. For the experimental results, the authors mainly consider their model's performance on regression and classification tasks, but I would like to see more analysis of the learned representations; adding more visualization analysis would help demonstrate the framework's effectiveness.
6. The experimental settings are unclear. For example, since your datasets only contain train/test sets, how do you pick your hyperparameters; is it based on performance on the test set? What kind of regression task is used in your regression experiments; is it a one-step-ahead prediction task?
7. In Table 5 you claim your model is faster than the ROCKET model, but the running time reported for your model is the per-epoch training time, not the total training time. I think this seems a bit unreasonable; please check it.
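As a rough illustration of the masked-imputation pretraining this reviewer describes, a sketch could look like the following; the masking ratio, the use of a vanilla `nn.TransformerEncoder`, and zero-filling of masked values are assumptions for illustration rather than the paper's actual implementation.

```python
import torch
import torch.nn as nn

class MaskedImputationModel(nn.Module):
    """Rough sketch: input projection + learned positional encoding + transformer encoder + output head."""
    def __init__(self, n_vars, d_model=64, n_heads=4, n_layers=2, max_len=512):
        super().__init__()
        self.in_proj = nn.Linear(n_vars, d_model)
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))   # learned positional encoding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.out_proj = nn.Linear(d_model, n_vars)                  # predict every variable at every step

    def forward(self, x):                                           # x: (batch, length, n_vars)
        h = self.in_proj(x) + self.pos[:, : x.size(1)]
        return self.out_proj(self.encoder(h))

def imputation_loss(model, x, mask_ratio=0.15):
    """Mask each (time step, variable) value independently and score only the hidden values."""
    mask = torch.rand_like(x) < mask_ratio
    pred = model(x.masked_fill(mask, 0.0))
    return ((pred - x)[mask] ** 2).mean()                           # MSE on masked positions only

model = MaskedImputationModel(n_vars=6)
x = torch.randn(8, 100, 6)                                          # (batch, window length, variables)
imputation_loss(model, x).backward()
```

The finetuning step the reviewer asks about would then replace `out_proj` with a task head and continue training all weights.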
docsep
This paper uses a transformer to improve multivariate time series classification and regression, using a BERT-inspired self-supervised loss. The authors show improvements over multiple standard datasets, and the use of self-supervision improves performance in the low-labeled-data regime.

Strong points:
- The method is well explained and takes good inspiration from BERT pretraining.
- The data-availability experiment shows that training with self-supervision helps for classification.
- This work is well positioned in the literature of the field.

Weak points:
- The time series representation is obtained by concatenation of the token representations. This means that the representations are not of fixed size, or contain padding representations.
- The introduced masking loss could suffer from a discrepancy between training and inference, where all the covariates at a given time step need to be predicted. Moreover, the model could rely on strongly correlated covariates, which frequently occur, to recover the masked variables, limiting the effect of self-supervised training. I would like to see whether masking all variables at some time steps helps or hurts the pretraining (a small sketch contrasting the two masking schemes follows this review).
- The paper does not consider whether finetuning is needed, or whether the network could be frozen and only a small MLP learned on top of the pre-extracted time series representations.
- The hyperparameters of the network are chosen per dataset but are not reported. It is not clear if the tuning is done with cross-validation or by looking at the test error. The authors should report the performance of the transformer with common hyperparameters specified in the appendix; this would be fairer when comparing with Franceschi et al., who do not tune the hyperparameters of their encoder.
- The claim in the abstract that the method offers computational efficiency is not backed with evidence: Section A4 (execution time) does not report the execution time of the method compared to the baselines.

Decision: I tend to reject this paper. The idea is interesting, but some design decisions of the method (representation pooling, self-supervised loss, need for finetuning) should be better justified. Secondly, the evaluation of the method should be more precise (hyperparameter tuning, speed).

Questions/remarks:
- In Section 3.1, "because the computational complexity and the number of parameters of the model scale as O(W^2) with the input sequence length W": the number of parameters of the transformer architecture does not scale with the input length; only the memory footprint and the computation scale quadratically.
- "We note that, similar to the case of word embeddings, the positional encodings generally appear not to interfere with the numerical information of the time series": do you have evidence for this?
- Have you tried PowerNorm by Shen et al. (2020), which you cite in Section 3.1?
- You could mention "Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting" (Lim et al., 2019, arXiv:1912.09363) for another use of transformers, for univariate quantile forecasting.
- Add the dataset you are experimenting with in the caption of Figure 2.

Additional feedback:
- Figure 2: choose different colors for the left and right figures.
- Table 1: consider using booktabs for the table format, and transpose datasets and models to make it fit in the margin.
- Section 3.2: the sentence starting with "the reason", which spans 6 lines, could be split and reworded.
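To make the masking-scheme question concrete, a small sketch of the two alternatives the reviewer contrasts might look as follows; the 15% ratio and tensor shapes are arbitrary assumptions.

```python
import torch

x = torch.randn(8, 100, 6)                   # (batch, time steps, variables)
ratio = 0.15

# Scheme A: mask each (time step, variable) cell independently; strongly correlated
# covariates at the same time step stay visible and can leak the masked value.
cell_mask = torch.rand_like(x) < ratio

# Scheme B: mask all variables at a sampled subset of time steps, forcing the model
# to rely on temporal context instead of co-occurring covariates.
step_mask = (torch.rand(x.shape[0], x.shape[1]) < ratio).unsqueeze(-1).expand_as(x)

x_scheme_a = x.masked_fill(cell_mask, 0.0)
x_scheme_b = x.masked_fill(step_mask, 0.0)
```

Comparing pretraining curves under the two schemes would directly answer whether whole-time-step masking helps or hurts.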
docsep
Summary: The paper proposes an unsupervised learning framework, similar to the BERT idea, but for multivariate time series.

Architecture: Inputs are projected into a d-dimensional vector space, where d is the dimension of the transformer model's sequence-element representations. A learned positional encoding is added to the input, which is passed to the transformer encoder. The output of the encoder is then passed to another projection layer that depends on the application, i.e., classification or regression. The transformer architecture is similar to Vaswani et al. (2017); however, they replaced layer norm with batch norm (a small sketch of this pipeline follows this review).

Unsupervised learning (training): For unsupervised learning, the paper proposes taking the input data, masking it, and predicting an output sequence. The loss is then calculated as the mean squared error between the actual input and the predicted input.

Evaluation and results: The paper considers two tasks, regression and classification. For regression, the proposed method was evaluated on datasets from the Monash University/UEA/UCR time series regression archive (Tan et al., 2020a). For classification, the paper used the UEA time series classification archive (Bagnall et al., 2018). For each dataset, they showed the results of using the proposed transformer architecture in a supervised manner, and then of using the same dataset for pretraining the transformer with the proposed masking approach in an unsupervised manner, followed by finetuning using labels. Both methods were compared to other state-of-the-art methods for the same task. For multivariate regression, pretraining followed by supervised training showed improvements on 3 out of 6 datasets; for classification, 7 out of 11 datasets showed improvements.

Strengths:
- The paper focuses on an important yet relatively unexplored area.
- The paper is clear, well written, and well motivated.
- The paper benchmarks across multiple datasets on both classification and regression tasks.

Weaknesses:
- My main concern is the lack of novelty. The paper is basically suggesting to use a transformer encoder, add a dense layer before and after, and note that if we use unsupervised training of the transformer on the same dataset, we may achieve better results. The main success of BERT is the ability to transfer and improve performance on unrelated tasks; however, here they did not include any experiments showing that a model trained on one dataset can be transferred to another. I believe the critical missing experiments are unsupervised training on, say, dataset 1, then finetuning on dataset 2, showing improvements on multiple tasks. Even if these experiments were provided, I still believe the paper lacks novelty; perhaps investigating what properties are being transferred in multivariate time series, and showing the difference between transferring from univariate to multivariate and vice versa, would help.
- The paper replaced layer norm with batch norm and only stated, "here we instead use batch normalization because it can mitigate the effect of outlier values in time series, an issue that does not arise in NLP word embeddings." They did not show the effect of using batch norm on the accuracy, nor give any insight into why it mitigates the effect of outlier values in time series.
- The code implementing the paper is not provided.
- The effect of changing the value of the masking variable r is not investigated.
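Based on the architecture description in this review, a compact sketch might look like the following; the dimensions, the exact batch-norm placement, and the flatten-then-project head are my assumptions from the review's wording, not the paper's released code (which, as noted, is not provided).

```python
import torch
import torch.nn as nn

class BatchNormEncoderLayer(nn.Module):
    """One encoder block with batch norm where a standard transformer uses layer norm."""
    def __init__(self, d_model=64, n_heads=4, d_ff=128):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
        self.bn1, self.bn2 = nn.BatchNorm1d(d_model), nn.BatchNorm1d(d_model)

    def _norm(self, bn, h):                       # BatchNorm1d expects (batch, channels, length)
        return bn(h.transpose(1, 2)).transpose(1, 2)

    def forward(self, h):                         # h: (batch, length, d_model)
        h = self._norm(self.bn1, h + self.attn(h, h, h, need_weights=False)[0])
        return self._norm(self.bn2, h + self.ff(h))

class TSTransformer(nn.Module):
    def __init__(self, n_vars, n_outputs, d_model=64, seq_len=100, n_layers=2):
        super().__init__()
        self.in_proj = nn.Linear(n_vars, d_model)                    # project inputs to d dimensions
        self.pos = nn.Parameter(torch.zeros(1, seq_len, d_model))    # learned positional encoding
        self.blocks = nn.ModuleList(BatchNormEncoderLayer(d_model) for _ in range(n_layers))
        self.head = nn.Linear(d_model * seq_len, n_outputs)          # task-dependent projection layer

    def forward(self, x):                          # x: (batch, seq_len, n_vars)
        h = self.in_proj(x) + self.pos
        for blk in self.blocks:
            h = blk(h)
        return self.head(h.flatten(1))             # classification logits or regression outputs

model = TSTransformer(n_vars=6, n_outputs=5)
out = model(torch.randn(8, 100, 6))                # -> (8, 5)
```

For pretraining, the same backbone would keep a per-time-step output projection instead of the flattened head.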
### Summary:
The authors extend the transformer to multivariate time series. The proposed extension is simple and lacks novelty, and some design decisions of the proposed method should be better justified. Similar works that also use the transformer for time series are not compared against. The experimental results are not convincing: the settings are unclear, the selection of datasets needs more justification, and some important experiments are missing. Finally, the writing can also be improved.
12008, 26620, 588, 1361, 50276, 783, 2929, 7932, 3828, 5222, 342, 14604, 5222, 285, 760, 4767, 1060, 359, 3185, 897, 14604, 21539, 984, 352, 476, 29966, 253, 1055, 273, 562, 3623, 2193, 275, 673, 2962, 271, 2523, 326, 1057, 417, 12893, 275, 295, 24343, 3159, 21496, 597, 858, 417, 921, 253, 1055, 273, 970, 14604, 5222, 327, 253, 7200, 390, 3534, 667, 16039, 327, 2139, 352, 29966, 253, 1055, 273, 562, 3623, 2193, 275, 673, 2962, 50276, 783, 2127, 16994, 253, 2929, 310, 417, 2530, 50276, 783, 1055, 273, 6890, 2193, 4778, 391, 275, 44790, 310, 417, 6949, 50273, 187, 187, 4118, 18435, 27, 783, 4477, 8725, 253, 39707, 281, 21471, 673, 2962, 253, 4081, 6880, 310, 2969, 285, 19756, 38135, 690, 2216, 7089, 273, 253, 4081, 1332, 943, 320, 1805, 17285, 2074, 2987, 326, 671, 897, 253, 39707, 323, 2069, 12395, 403, 417, 2429, 50276, 49363, 1543, 403, 417, 21414, 253, 7533, 403, 12744, 285, 253, 5438, 273, 15302, 3198, 625, 816, 6787, 690, 1774, 4679, 403, 5816, 50276, 71, 3341, 4028, 476, 671, 320, 5520 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

This work proposes a method to disentangle epistemic from aleatoric uncertainty in order to avoid the noisy-TV problem, which occurs when intrinsically motivated agents get rewarded for visiting states that have high irreducible uncertainty. The main weakness of the current work is that it misses relevant work which does the same thing, while it seems to misinterpret certain parts of the literature. As an example, the authors state that tractable epistemic uncertainty estimation with high-dimensional data is an unsolved problem, citing Gal (2016), which demonstrates a method for estimating epistemic uncertainty in high-dimensional data: that is MC dropout. Also, there is a similar approach using deep ensembles to assess epistemic uncertainty. Although far from optimal, such approaches are already being used in epistemic-uncertainty-driven exploration, for example in Planning to Explore (https://arxiv.org/abs/2005.05960), where the authors use epistemic uncertainty estimation as an intrinsic reward. In fact, in the last paragraph of related work the authors cover a few efforts using epistemic uncertainty estimation as intrinsic rewards, while they do not mention what shortcomings of such work they aim to solve with their proposal.

In terms of experimentation, the selected baselines are not sufficient to demonstrate how much better this method is. I would expect to see a comparison with MC dropout or deep-ensemble-based methods of assessing epistemic uncertainty in the presence of heteroscedastic noise, or DUE (https://arxiv.org/pdf/2102.11409.pdf).

Detailed comments:

C1. "This work presents aleatoric mapping agents which use single network deterministic uncertainty estimation": is the reference correct there? The Kendall & Gal work uses a Bayesian neural network, whereas deterministic uncertainty estimation was proposed here: https://openreview.net/forum?id=fu7d6kqpzs4 (not cited in the text).

C2. For epistemic and aleatoric uncertainty you cite Hüllermeier & Waegeman; can you also cite the earlier references, e.g. Hora, S. (1996), "Aleatory and epistemic uncertainty in probability elicitation with an example from hazardous waste management", Reliability Engineering and System Safety 54(2-3):217-223.

C3. "However, as far as we are aware, we are the first to compute aleatoric uncertainties with a scalable curiosity framework to reduce intrinsic rewards for those state transitions with aleatoric uncertainty": I do not understand what the novelty is here. There is a lot of work disentangling epistemic from aleatoric uncertainty that works at scale. Can you please state as precisely as possible what the contribution is in terms of the computation of aleatoric uncertainty?

C4. "We implicitly incentivise agents to seek epistemic uncertainties by removing the aleatoric component from the total prediction error": if I am not mistaken, that is the common way to disentangle the epistemic from the aleatoric uncertainty, which is usually derived via the law of total variance. What is the novelty here?

C5. "Possible because regularisation terms could absorb some epistemic uncertainty": what does it mean to absorb some epistemic uncertainty? Epistemic uncertainty can be reduced or increased. Do you mean that through the regularization the epistemic uncertainty is reduced? If so, can you explain why this can happen?

C6. In the MiniGrid experiments, can you try to implement the intrinsic reward based on the epistemic uncertainty as evaluated from ensemble (Kendall & Gal, 2017) models? That should be the variance over the expectations of the deep ensemble components, Var_{θ ~ p(θ | D)}[μ_θ(s_{t+1})].

C7. It is very difficult to follow the experimental protocol in Section 5. How do you estimate the acetylcholine in the simulations?

Finally, I would like to suggest that the authors try to use \citep instead of \cite whenever possible, and also use a more accessible colour for the citations, as currently it makes the paper challenging to read.

My recommendation is influenced by the following issues. The novelty of the contribution is not clear to me; I would like to understand C3-C6, because overall it seems to me that epistemic-uncertainty-driven intrinsic rewards have been used before, and here the method is not compared thoroughly against such baselines, which are mentioned in related work. Regarding C7, I do not understand how this section contributes towards understanding what type of uncertainty acetylcholine signals.

This paper suggests an intrinsic bonus for exploration that avoids the noisy TV by adding a penalty for the estimated variance of the reached state s_{t+1} given the previous state s_t. For this, they fit an independent normal model of the new state s_{t+1}, with mean μ_{t+1} and variance σ²_{t+1} predicted from the previous state s_t. The bonus is then given by the error of prediction, (s_{t+1} - μ̂_{t+1})², minus the variance penalty σ̂²_{t+1}. That way, a noisy TV will always be fitted with a higher variance of a similar order to the error of prediction, so the bonuses for both predictable and unpredictable transitions will end up at zero on average after learning a model of the dynamics of the whole environment.
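As a rough illustration of the error-minus-variance bonus described above, here is a minimal sketch assuming a diagonal-Gaussian predictor trained with the Gaussian negative log-likelihood; the module and variable names are illustrative and not taken from the paper under review.

```python
import torch
import torch.nn as nn

class GaussianPredictor(nn.Module):
    """Two-headed model: predicts mean and log-variance of s_{t+1} from s_t."""
    def __init__(self, state_dim, hidden=128):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.mu_head = nn.Linear(hidden, state_dim)
        self.logvar_head = nn.Linear(hidden, state_dim)

    def forward(self, s_t):
        h = self.body(s_t)
        return self.mu_head(h), self.logvar_head(h)

def nll_loss(mu, logvar, s_next):
    # Gaussian negative log-likelihood (up to an additive constant).
    return 0.5 * (((s_next - mu) ** 2) / logvar.exp() + logvar).sum(-1).mean()

def intrinsic_bonus(mu, logvar, s_next):
    # Prediction error minus the predicted (aleatoric) variance.
    sq_err = ((s_next - mu) ** 2).sum(-1)
    return sq_err - logvar.exp().sum(-1)
```

On a transition whose outcome is irreducibly noisy, the learned variance grows to match the average squared error, so this bonus averages out to zero rather than rewarding the agent for watching the noise.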
I am surprised that this idea was not explored in an earlier paper: taking the prediction error minus the variance looks like the very first solution one would try in order to prevent agents from staying to watch a noisy TV. While my knowledge of this domain is limited, to my mind this is the first paper suggesting this idea; probably other reviewers will prove me wrong.

I expect this approach to work well and I trust the good experimental results exposed in the paper. I believe this could have a nice impact on the curiosity/exploration efforts in the RL community.

However, there are many weaknesses that make this paper not ready for publication. First, all the mathematical justifications are unclear and slovenly written, with too many errors in the equations. Eq. 3 is wrongly reported: it should be s_{t+1} on the left of the noise term and ŝ_{t+1} on the right of the variance term. In Eq. 5, μ_{t+1} should actually be s_{t+1}, except if it is an average over a batch of states s_{t+1} with the same previous s_t, but I guess it is not. In Eq. 6, ŝ should be μ̂ for consistency. Also, I feel that there are a lot of paragraphs just to describe a very simple Gaussian model used to fit s_{t+1} given s_t by maximum log-likelihood: in Eq. 5, σ is the estimated variance and μ is the estimated mean; Eq. 5 is just the log of Eq. 4 with λ = 1/π; then in Eq. 7 it is just r = error - estimated variance, with no need for η. The point is that these terms should have the same expectation at convergence of the fitted model, with η = 1.

The bio-inspired aspect needs more content to be convincing, for example reporting models or experimental results from the biological papers (for instance, the actual original curves that Fig. 5 is trying to reproduce).

What I suggest: the experimental section looks good to me, but the whole method section needs to be rewritten from scratch. After introducing the MDP notations, I would directly start with the final Equation 7: we take the error minus the variance. Then show that the mean of this bonus is zero when the model is learned, so whatever the unpredictability, the bonus will go to zero. Optionally, look at the suboptimality behaviour of the bonus: is the error higher or lower than the variance, and how does it affect the agent's curiosity? Finally, explain how the mean and variance are learned, here with a two-headed network trained by maximum log-likelihood. Also discuss what happens if the actions are taken into account in the learned model. So far the bonus is on-policy, and this should appear somewhere and be discussed: here everything works as long as the agent is not changing its policy, but depending on the learning dynamics of the policy, the model may never converge; this would not happen with actions taken into account.

Nice and novel idea; too poor mathematical justifications and unclear description of the method. I sincerely hope the paper will be improved for a further submission, because I believe the idea has a high potential that would be missed if it appears in its present slovenly form.
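For reference, the zero-mean property the reviewer asks the authors to show follows in one line under the diagonal-Gaussian model sketched above (an editorial illustration, not a derivation taken from the paper). The fitted objective is

$$-\log \mathcal{N}\big(s_{t+1};\,\mu_\theta(s_t),\,\sigma^2_\theta(s_t)\big) = \frac{\big(s_{t+1}-\mu_\theta(s_t)\big)^2}{2\,\sigma^2_\theta(s_t)} + \tfrac{1}{2}\log \sigma^2_\theta(s_t) + \mathrm{const},$$

whose optimum for a fixed mean is $\sigma^2_\theta(s_t) = \mathbb{E}\big[(s_{t+1}-\mu_\theta(s_t))^2 \mid s_t\big]$. Hence the bonus $r_t = (s_{t+1}-\mu_\theta(s_t))^2 - \sigma^2_\theta(s_t)$ has zero conditional expectation once the model has converged, whatever the level of irreducible noise in the transition.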
This paper extends the works on curiosity in artificial agents by incorporating an explicit prediction of non-reducible (aleatoric) uncertainty into the agent's action-choice computation, such that the agents have a preference against the environmental states where uncertainty cannot be reduced by learning. The proposed framework has enabled efficient exploration in a set of tasks where non-reducible uncertainty has been introduced. The authors have also used it to propose a test to clarify the role of acetylcholine, the neurotransmitter related to uncertainty in the brain.

Strengths:
- The paper provides a very nice, comprehensive review of the existing curiosity techniques, which was a pleasure to read.
- I like the neuroscientific inspiration of the proposed algorithm. The field is full of claims about neuro-AI synergy but short of good examples; this work, further developed, may become a good example of such synergy.
- The idea of explicitly accounting for aleatoric uncertainty in deciding on future actions seems highly meaningful and potentially fruitful.
- In this formulation the uncertainty is estimated with a single agent and not with ensembles of agents as was done before; this makes the model more useful.
- The authors have performed an impressive set of experiments involving a significant number of baselines.
- I like the idea of proposing a test which experimental neuroscientists may then use to improve our understanding of the brain.

Weaknesses:
- Given the close relation of the proposed method to prior work (as mentioned by the authors), it would be reasonable to expect a strong results part of the paper; see below.
- Most of the experiments in this paper were performed in environments specially altered to introduce stochastic traps; the framework's performance in unaltered environments did not exceed that of the baselines.
- Exploration has previously been developed as a tool with the final goal of maximizing rewards in scarce environments. In this work the authors report improvements in exploration, but there are no experiments indicating improvement of total reward in standard, unaltered environments. The authors do explicitly discuss this issue: it is unclear whether additional exploration as suggested here leads to higher rewards in the standard or real-world environment, which is the goal of exploration in the first place.
- In the proposed experiment for establishing the role of acetylcholine in the brain, the prediction seemingly boils down to the fact that epistemic uncertainty decays over time whereas aleatoric uncertainty remains constant. These predictions seem to follow directly from the definitions of such uncertainties, so it is unclear whether the proposed design of the experiment is necessary.

Specific comments: in Equation 5, is the last sign correct? Although the notation is clear, consider introducing the variables before using them. Whereas the introduction is generally great, it may be useful to talk a bit more about Angela Yu and Peter Dayan's model, as it forms the main neuroscientific motivation for the proposed framework, so that the readers will not have to read their paper. I would personally consider adding a short description of how Equation 5 was derived; it is pretty clear for me but may or may not be clear for everyone, and the derivation is super simple and will not take up a lot of space.

Suggestions to the authors: I found this work highly promising and would love to see future developments based on your framework, perhaps something along these lines. It would be great to see the improvement of the received reward in standard or real-world environments. The claim here, supported by the literature, is that stochastic traps are the major problem in curiosity-driven exploration; this implies that there are enough standard environments where stochastic traps affect the efficiency of baseline curiosity-based algorithms. Such environments can be used to showcase the benefits of the proposed algorithm when it comes to rewards, which consequently would skyrocket the impact of this work. It would also be nice if the authors followed up on their proposed acetylcholine experiment and collaborated with experimental neuroscientists to test this hypothesis; determining that acetylcholine encodes one type of uncertainty or another would be an important result and another great way to ascertain the high impact of this work.

An impressive amount of work has been done, the writing is mostly easy to follow, and the results seem promising. However, provided that the proposed framework is closely related to existing ones, further research towards practical results is needed. The better exploration proposed here is important, but it may or may not lead to better-performing agents. I therefore think that the paper is better suited for a conference workshop rather than for the main track; thus, overall, I tend to recommend rejection.

### Summary:
The authors present a method to disentangle epistemic from aleatoric uncertainty for avoiding the noisy-TV problem during self-driven exploration. This is an important area where we need more ideas and experiments. The authors present a biologically inspired approach and support it through experiments, although it does not present state-of-the-art exploration in well-known RL environments; I acknowledge that new solutions to problems that were previously intractable often face such an issue. The prediction to discriminate neuroscientific modulations that directly encode epistemic and aleatoric uncertainty is bold but not very specific. Unfortunately, as the reviewers noted, the manuscript in its current form does not quite meet the bar yet. I suggest comparing against methods for directly estimating uncertainty. I also suggest adding a discussion of the estimation bias of the epistemic uncertainty for the proposed method. I strongly encourage the authors to continue this interesting line of work.
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

This paper proposes spiking neural networks (SNNs) as an efficient neuromorphic alternative to dilated temporal convolutions based on the WaveNet architecture for keyword spotting. The main idea is to model the delay periods in the dilated convolutions of WaveNet as synaptic time constants in SNNs, for efficient implementation with neuromorphic hardware. For this task, the authors introduce an effective transformation of the original WaveNet architecture to a qualitatively equivalent SNN architecture, which they name WaveSense. They then show by experimental evaluation that WaveSense surpasses the SOTA SNN performance and comes close to the SOTA performance of CNN- and LSTM-based methods in keyword spotting.

Strengths:
1. This study addresses a valid problem: although neuromorphic systems have the obvious advantage of low-power usage, in terms of performance the existing neuromorphic solutions are not on par with traditional methods.
2. The authors clearly explain how they transform the original WaveNet architecture into their SNN architecture, WaveSense, to realize an efficient neuromorphic system. I think that the experimentation configurations are also detailed sufficiently for reproducibility of the results; the authors will open-source their code as well.
3. The performance evaluations are comprehensive, testing on three different public datasets and comparing against both existing SNN and ANN methods. The results adequately support the performance claims of the proposed model.
4. The discussion section is quite informative.

Weaknesses:
1. I think the main weakness of this paper is the limited evaluation regarding efficiency. I understand that neuromorphic hardware is by nature low-power compared to classical digital processors, and thus we may accept that it is always efficient. However, the authors do not compare the efficiency of their method with respect to previous SNN models, except for the model size and parameter count in Table 1. As efficiency is the selling point, it is desirable to see the comparison against the SNN models given in Table 2 as well. It would also be helpful if the authors could elaborate on whether neuron and parameter counts are directly comparable across different neuromorphic systems.
2. The authors often use the term spatiotemporal, as in "we focus on audio tasks as spatiotemporal tasks". I do not quite understand the use of this term in the context of this paper, because there is no spatial domain with single-channel audio; the authors should clarify this issue.
3. The authors obtain 0.95 FA/h (false alarms per hour) with a 0.8 FRR (false rejection rate), while Coucke et al. (2018) obtain FRR = 0.12 for a fixed FA/h of 0.5 on the Hey Snips dataset. The authors say that their results were slightly worse; however, it is not easy to compare these numbers since the authors do not fix their FA/h score. The authors need to explain how they reach that conclusion based on these numbers; a plot of FA/h versus FRR could be helpful.

Although the proposed method looks promising and comparative evaluations show performance improvements over other SNNs, nearing the ANN performances, the efficiency aspect compared to other SNNs has not been justified well, and there are some other critical flaws as pointed out by the other reviewers. The authors did not address these issues; therefore my decision is reject.

The paper proposes a spiking neural network (SNN) that takes advantage of the time constant of the neuron spiking dynamics to implement temporal convolution, instead of using synaptic delays, in order to reduce memory requirements. Input temporal data such as audio streams are processed continuously without any buffering, which makes it suitable for always-on audio recognition systems. Further, the spiking architecture makes it suitable for low-power applications using neuromorphic processors. It adopts an existing artificial neural network (ANN) architecture for generating audio waveforms, converts it to the spiking domain, and trains the network using an established spiking backpropagation algorithm. The experimental results show improved audio keyword recognition performance compared to existing spike-based methods.

Strengths:
1. The paper demonstrates that the time constant of the leaky integrate-and-fire (LIF) neuron can be effectively used to implement time delays as required for temporal convolution. This can reduce the memory requirements of temporal spiking algorithms in general, compared to buffering spikes or using synaptic transmission delays (a toy contrast of the two mechanisms is sketched after this list).
2. Simple and standard neuron and synapse models allow the proposed SNN to be compatible with most neuromorphic processors; therefore it can have applications in low-power AI systems.
3. The results presented show that the proposed SNN performs competitively against the other existing SNN models on keyword spotting tasks and is only slightly less accurate than the ANN models.
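A toy contrast of the two mechanisms mentioned in strength 1, assuming a single scalar input channel (constants and function names are illustrative, not taken from the paper): an explicit delay line needs a buffer per delayed connection, while a cascade of leaky integrators encodes a smeared delay in its time constant with only two state variables.

```python
import numpy as np

def delay_line(x, d):
    """Explicit delay by d steps (d >= 1): keeps a buffer of d past samples (O(d) memory)."""
    out = np.zeros_like(x, dtype=float)
    out[d:] = x[:-d]
    return out

def alpha_synapse(x, tau, dt=1.0):
    """Two cascaded leaky integrators (synaptic current -> membrane potential).

    An input impulse produces a response proportional to t * exp(-t / tau),
    which peaks roughly tau steps later, so the 'delay' lives in the time
    constant and only two state variables are kept (O(1) memory).
    """
    a = np.exp(-dt / tau)
    i_syn, v_mem = 0.0, 0.0
    out = np.empty_like(x, dtype=float)
    for t, xt in enumerate(x):
        i_syn = a * i_syn + xt
        v_mem = a * v_mem + (1.0 - a) * i_syn
        out[t] = v_mem
    return out
```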
Weaknesses:
1. The paper provides limited technical novelty. The network architecture, peak-detection training loss, and learning algorithm are not new; the ANN architecture was used for keyword spotting by Coucke et al. (2018). The paper also lacks justification, with quantitative or qualitative reasoning, for selecting the network architecture, training scheme, etc.
2. The paper proposes that the time constants be hand-tuned based on the task. This requires at least two comparisons to justify, which are missing from the paper: (1) a graph showing the effect of varying the time constant on accuracy for different tasks, i.e. showing quantitatively the relation between tasks and time constants, and (2) a comparison of the hand-tuned time constant versus learning the time constant, similar to the SRNN by Yin et al. (2020).
3. The paper does not quantify the improvement in memory usage over the alternative designs, i.e. temporal convolution using spike buffers or synaptic delays. The significance of the memory saved via the proposed method, compared to the memory available in standard neuromorphic processors, is not known.
4. Since keyword recognition is based on the maximum neuron activation during presentation, it is not clear how, in deployment, the proposed method will handle the absence of any keyword, because there will always be a maximum activation; intuitively a thresholding is required to avoid false positives.
5. Since the prediction by the proposed SNN is a function of the network state, its prediction performance depends on the order in which the test set is presented. The paper should present accuracy results as the average accuracy over a large number of runs, where in each run the test set is presented in a new random order.

While the presented results show that the proposed temporally dilated spiking convolutional network can be trained to recognize audio keywords with competitive accuracy, the main contribution of the paper is a system that combines ideas from multiple existing works. Some key experiments pertaining to the claims are missing, such as the benefit of manual time-constant selection, how the manual approach generalizes to new tasks, how it detects the no-keyword case in deployment, etc. The experiment section is missing data about the memory-requirement reduction over alternative approaches and throughput comparisons on a standard neuromorphic processor.

The paper can be strengthened by the addition of the following:
1. Time-constant optimization instead of hand selection, and its effect on accuracy on different tasks; in the current setup, this should analyze the effect of varying the constant in Equation 8.
2. A comparison of the memory requirement of a WaveSense equivalent implemented using synaptic delays versus WaveSense using time constants, to show the reduction in memory and computation requirements from using the synaptic time constant, as mentioned in Section 2.1.
3. Experimental results showing robustness regarding weakness 5.

If detailed experiments and analysis are presented on the above issues in the revised version, with justification of the related claims, I will be inclined to reconsider my score towards a marginal accept.

The authors propose a way to translate WaveNet into an SNN-based network, referred to as WaveSense. The performance of the spiking WaveNet on three different datasets was compared with a few previous results. Several technical aspects of the proposed method were reported, including data preprocessing, the translation into spiking neurons, and so forth.

Strengths: the results are interesting and promising; the benchmark is quite weak, though.

Weaknesses:
1. The translation of WaveNet into an SNN-based network itself appears correct. However, I would point out that the SNN considered is absolutely not an SNN, given the use of multiple spikes per timestep. Fundamentally, spiking neurons in an SNN should communicate via one-bit-per-timestep data, given the event-based data processing in nature; this is the common constraint on spike data among all SNN models. Otherwise, the network is equivalent to a DNN unfolded in the time domain with integer activations. In this regard, strictly speaking, the network considered in the present work is not an SNN; therefore the comparison of this work with other SNNs is not fair at all. For a correct comparison, the authors should limit the number of spikes per timestep per neuron to one and re-evaluate the performance.
2. This work hardly includes important fundamental aspects of the proposed network. The authors' translation of WaveNet into an SNN-based network is understandable, yet it is not clear where the breakthrough lies and how it works compared with the DNN-based WaveNet; an ablation study is also missing.
3. The benchmark is weak. The authors used only relatively new datasets and compared their results with only a few previous works; there are a number of previous works on Google Speech Commands that should be compared.

Based on my concerns listed above, I recommend reject.
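To make the one-bit-per-timestep constraint in weakness 1 concrete, a generic discrete-time LIF update of the kind the reviewer has in mind emits at most one binary spike per neuron per step (a textbook sketch, not code from the paper under review):

```python
import numpy as np

def lif_step(v, x, tau=10.0, v_th=1.0, dt=1.0):
    """One timestep of leaky integrate-and-fire neurons with hard reset.

    v : membrane potentials, x : weighted input currents (arrays of equal shape).
    Returns the updated potentials and a binary spike (0 or 1) per neuron,
    i.e. at most one bit of output per neuron per timestep.
    """
    v = v + (dt / tau) * (x - v)            # leaky integration toward the input
    spikes = (v >= v_th).astype(float)      # threshold crossing
    v = v * (1.0 - spikes)                  # hard reset where a spike occurred
    return v, spikes
```

Allowing integer spike counts per step instead makes the model behave like a quantized ANN unrolled in time, which is exactly the fairness concern raised above.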
to use spike delays for neuromorphic computations the concept of using different time constants to approximate the dynamics of the delays can potentially be more efficient to compute on the neuromorphic processor 2 the motivation of using peak loss function for fast detection is interesting and may be useful for many other streaming applications which require instantaneous detection it will be interesting to see how the improvement in detection speed using the peak loss function compares against conventional loss functions that use the last step output 3 the proposed snn outperformed other snn methods on keyword spotting this shows that snn using temporal convolutions is more suitable for this task than other snns that use recurrent or fullyconnected layers cons 1 the paper lacks evidence to back the primary motivation of the proposed approach the work is motivated by the need for lowpower keyword spotting systems however the authors do not show any energycomputation cost comparison between the proposed snn method and existing approaches without this comparison theres no quantifiable advantage for the proposed snn which invalidates the methods value energy efficiency is not a default property of snn and an snn needs to be deployed appropriately on a neuromorphic processor to realize this advantage the deployment is not trivial and the efficiency highly relies on the dynamics and resource cost of the network the authors need to either estimate the computation cost of the snn by counting the number of operations or deploy the snn on a neuromorphic processor for energy benchmarking 2 in addition other lowpower variants of deep networks like binary neural networks exist for keyword spotting for example liu bo et al 2019 peter david et al 2020 the authors need to discuss the advantages of using snn over these approaches and compare the snn with these lowpower approaches on energy cost and accuracy 3 the paper lacks ablation studies to show that the networks high performance comes from the proposed spiking temporal convolution mechanism since spiking neurons have an inherent information integration in the temporal dimension the high accuracy is not enough to prove the temporal convolution is correctly functioning when compared with the other snn approaches the result for the proposed method might be better due to the more complex network structure thus the authors need to provide additional ablation studies on snn without temporal convolution to prove the proposed mechanism works moreover the author can also compare the synaptic time constant with the delay approach to show how much the approximation influences representation capacity 4 the structure and writing of the paper need considerable improvements to reach the standards of the conference the current structure of the introduction might be hard to follow for the readers who are unfamiliar with the work the reviewer recommends the authors to divide the section into an introduction section and a related works section the method section is also hard to follow with network design implementation details and experiment data mixed together the description of the spiking temporal convolution lacks important details for readers to understand the method the method section should be clear enough so that the readers can reproduce the method using just the writeup 5 figures in the paper lack sufficient details for understanding the proposed method better figure 1a is very similar to figure 3 in oord aaron van den et al 2016 with no additional 
information on how snn works with this network design with the figure and descriptions in the text its very difficult to understand how the snn computation is unrolled and whether or not spiking neurons in each layer are updated simultaneously figure 2 is similar to figure 4 in oord aaron van den et al 2016 although specific layers in oord aaron van den et al 2016 have been replaced by spiking layers the paper didnt argue why these layers are needed in the block oord aaron van den et al wavenet a generative model for raw audio 2016 coucke alice et al efficient keyword spotting using dilated convolutions and gating 2018 liu bo et al an ultralow power alwayson keyword spotting accelerator using quantized convolutional neural network and voltagedomain analog switching networkbased approximate computing 2019 peter david et al resourceefficient dnns for keyword spotting using neural architecture search and quantization 2020 overall the reviewer recommends not accepting the paper the authors gave a neat idea on how to implement temporal convolution with spiking neurons efficiently however the overall quality of the paper does not reach the standards of the conference first additional experiments are needed to validate the effectiveness of the proposed method and demonstrate the advantage of the method compared to existing approaches second the structure and writing of the paper need considerable improvements to make it more readable more detailed problems of the paper are listed in the cons in the main review docsepthis paper presents the wavesense model consists of spiking neurons and the wavenet architecture for ultralow power local signal processing the authors conducted experiments on several realworld data sets for evaluating the efficacy of the proposed models this paper is hard to follow due to its tinpot writing and organization for example there should be some parentheses to surround the cite or reference besides its necessary to give a formal introduction for the notations in the spiking neural model such as taus its not reasonable to put the formulation about the spiking neural model to appendix there are two large blanks wasteful in page 6 fig 2 does not corresponds to the context in subsection 22 in addition using the wavenet architecture the authors should list all the trainable parameters and introduce their roles played in this model the points above prevent the understanding of this work this paper needs to strictly rewritten and organized for proofreading and thus i will take the score of rejection ### Summary:
the authors propose in this manuscript to use spiking neural networks snns as an efficient alternative to dilated temporal convolutions they propose to utilize the membrane time constant of neurons instead of synaptic delays for memory efficiency training such networks with bptt achieves better performance than other snnbased methods and achieve close to sota compared to ann solutions for keyword spotting pros the manuscript addresses an interesting problem performance is good cons limited evaluations regarding efficiency although this is a main point of the paper the technical novelty is limited one reviewer noted that the model is not actually an snn due to the use of multiple spikes per time step benchmarking is weak little comparison with previous work structure and writing of the paper needs improvement the authors did not reply to any of these critical points in summary although the idea seems interesting the manuscript is not ready for publication
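The reviews and summary above hinge on one mechanism: replacing the explicit spike delays of a dilated temporal convolution with per-neuron synaptic time constants, so that different channels retain their input over different horizons. The short numpy sketch below only illustrates that idea; it is not WaveSense or any reviewed implementation, and the layer sizes, time constants, weights, and input spike trains are invented for the example.

```python
import numpy as np

# Toy sketch (assumed, not WaveSense's code): a spiking layer whose synaptic currents are
# leaky integrators with different time constants; a long tau keeps older inputs "alive",
# standing in for the explicit delay lines / dilations of a temporal convolution.
rng = np.random.default_rng(0)

T, n_in, n_out = 100, 4, 8                                   # timesteps, input channels, neurons
taus = np.array([2.0, 4.0, 8.0, 16.0, 2.0, 4.0, 8.0, 16.0])  # made-up per-neuron synaptic time constants
alphas = np.exp(-1.0 / taus)                                 # per-step exponential decay factors
w = rng.normal(0.0, 0.5, size=(n_in, n_out))                 # made-up input weights
threshold = 1.0

spikes_in = (rng.random((T, n_in)) < 0.05).astype(float)     # toy input spike trains

i_syn = np.zeros(n_out)            # synaptic current, one leaky filter per neuron
v_mem = np.zeros(n_out)            # membrane potential
spikes_out = np.zeros((T, n_out))

for t in range(T):
    i_syn = alphas * i_syn + spikes_in[t] @ w   # leaky integration: tau sets how long inputs persist
    v_mem = v_mem + i_syn                       # integrate-and-fire membrane
    fired = v_mem >= threshold
    spikes_out[t] = fired
    v_mem = np.where(fired, 0.0, v_mem)         # reset neurons that spiked

print("output spike counts per neuron:", spikes_out.sum(axis=0))
```

Swapping the delay buffers of a dilated convolution for these per-channel decay factors is exactly the memory trade that the reviews ask to be quantified against an equivalent delay-based implementation.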
input_ids (token id sequence for this example; abridged): [ 253, 3388, 7200, 689, 247, 1781, 1180, 273, 6613, ... 417, 4704, 323, 9311 ]
attention_mask (all ones for this example; abridged): [ 1, 1, 1, ... 1 ]
labels (token id sequence for the same example; abridged; its leading and trailing values match the input_ids above): [ 253, 3388, 7200, 689, 247, 1781, ... 417, 4704, 323, 9311 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this work tackles the problem of modeling multiagent systems that contain a high degree of interaction between agents to model such systems this work proposes interaction modeling with multiplex attention imma which uses a multiplex graph latent structure to model different types of interactions in different layers of the multiplex in addition attention graph layers are used to capture the strength of the relations furthermore thanks to the structure of the multiplex latent layers this work proposes progressive layer training plt where one can train different interaction layers separately and iteratively add different latent layers the proposed approach is evaluated on three different datasets showcasing strong results on all furthermore ablation studies show how having multiple latent layers andor having plt help the models prediction performance furthermore the social navigation environment provides the groundtruth interaction structure of each scene this work shows that the predicted latent structure predicts ground truth more accurately than prior approaches strengths the paper is well organized and wellwritten i believe the application of multiplex attention in the latent space is novel to the problem of multiagent motion forecasting with interaction modeling the results are strong and convincing on multiple datasets the ablation studies provide further evidence of the utility of having multiple levels of attention weaknesses in the appendix it is mentioned that training is performed with qphizx1th this combined with the footnote on page 3 on which i have a question below it seems like the proposed approach should not be formulatedclassified as a cvae and is more of a model which has a specific hidden structure on the bottleneck that is not regularized by any external forces eg kl divergence these are adequately addressed in the conclusion and the appendix
docsepthe paper proposes a multiplex attention method for multiagent interaction modeling the main motivation is that there can be more than one interaction type between agents to address this the paper proposes to use multiple latent graphs to encode the different interactions between agents and employs the vae framework to infer the latent codes to ease learning the paper also proposes progressive layer training which is a curriculum learning method that gradually adds latent graphs to the model evaluations are performed on multiagent trajectory forecasting using one inhouse simulated dataset and two public datasets phase and nba the method outperforms prior methods in forecasting accuracy relational inference accuracy and sample efficiency strength the paper is generally wellwritten and easy to understand the motivation is clear for encoding multiple types of interaction between agents and their relative strength the paper performed extensive experiments and analyses on three datasets both trajectory forecasting and relation inference are evaluated the latent graph manipulation experiments are interesting which show the latent codes encode leaderfollower behaviors the paper outperforms the baselines significantly weakness the main idea multiplex attention of this paper is quite similar to factorized nri 1 where multiple latent graphs are used in a variational inference framework the technical novelty of this paper is limited in some sense the multiplex attention is not a big change from nri since the latent code itself is multidimensional therefore each dimension can be treated as a separate latent code and the main difference of the proposed approach is to not share encoder weights the paper didnt compare with strong multiagent trajectory forecasting baselines on classic trajectory forecasting benchmarks such as ethucy where a large number of agents are interacting with each other the paper could compare with sota methods such as 2 3 4 some design choices are not well motivated what is the motivation for softmax across agents in eq 2 this constrains the interaction strength from i to other agents to sum to 1 for interaction k however agent i can have strong interaction strength with multiple different agents for interaction k therefore i wonder why the method doesnt just formulate inference of zijk as a binary classification this interaction exists or not instead of multiclass classification across other agents 1 webb ezra et al factorised neural relational inference for multiinteraction systems icml 2019 workshops 2 mangalam karttikeya et al from goals waypoints paths to long term human trajectory forecasting proceedings of the ieeecvf international conference on computer vision 2021 3 yuan ye et al agentformer agentaware transformers for sociotemporal multiagent forecasting proceedings of the ieeecvf international conference on computer vision 2021 4 wang chuhua et al stepwise goaldriven networks for trajectory prediction ieee robotics and automation letters 72 2022 27162723 the paper didnt really talk about its limitation despite indicating in the form it has done so one clear limitation is that it cannot handle changing interaction types as the multiagent interaction evolves over time their interaction could also evolve for example a follower passes a leader to become the new leader recent work on dynamic nri 5 6 has attempted to address this limitation 5 graber colin and alexander schwing dynamic neural relational inference for forecasting trajectories proceedings of the ieeecvf conference on computer vision and pattern recognition workshops 2020 6 xiao ruichao manish kumar singh and rose yu dynamic relational inference in multiagent trajectories iclr 2021
docsepthe authors proposed a prediction model that uses a multiplex latent graph to represent multiple types of interactions and attention to account for different relations called interaction modeling with multiplex attention imma they also introduce a training strategy called progressive layer training plt the experimental results showed that their approach outperformed baseline models in trajectory forecasting and relation inference spanning three multiagent scenarios social navigation cooperative task achievement and team sports they also demonstrated better zeroshot generalization results in the social navigation environment strength they proposed a prediction model that combines multilayer graph neural networks and plt for interpreting the multiplexed latent structure the experiments showed that their approach outperformed baseline models in trajectory forecasting and relation inference spanning three multiagent scenarios they also demonstrated superior sample efficiency and better zeroshot generalization results weakness the novelty of this method seems to be combining multilayer graph neural networks and plt but the authors mentioned that each component has novelty which may be exaggerated i asked this question in the following questions the result was clear but there was no code available i saw the checklist and there were some questions in the experiment description the authors addressed the limitations and potential negative societal impact of their work
docsepthis paper presents a new method for modeling and predicting multiagent interaction observing that such interactions could be affected by multiple factors such as target destinations and collision avoidance during crowd navigation tasks the key idea was to model them using multiplex graphs specifically rather than classifying types of edges of graphs as done in existing work the proposed approach instead performs multiclass predictions for multiple edge types and constructs multiplex attention matrices to make latent graph layers diverse the proposed work introduces a progressive learning scheme progressive layer training that gradually increases the number of graph layers to learn experimental results on multiple datasets show the effectiveness of the proposed approach over existing methods strengths originality modeling multiagent interaction using multiplex graphs seems novel at least for trajectory forecasting tasks it is well motivated by the observations that human interactions can be affected by multiple factors quality i agree that the proposed approach is well designed and technically sound the questions that the paper addresses are clearly presented q1q4 in section 4 and answered in the experimental results experiment details as well as potential societal impacts are clearly described in the appendix clarity overall the paper is clearly written and easy to follow significance i believe that the proposed work is relatively significant mainly because it outperformed existing methods on multiple datasets the zeroshot generalization experiment was interesting and showed the usefulness of the proposed approach weaknesses the motivating example for multiple factors affecting interactions ie target friend and collision avoidance in figure 1 may be a bit confusing if i understand it correctly multiplex graphs are built rather in a bottomup fashion in the proposed method where we cannot ensure that each graph layer has such semantic meaning in fact im still a bit confused about what the multiplex graph actually learned is this just a leading agent as shown in figure 5 or something beyond while it was great that the proposed method performs very well i would also like to see typical failure cases are there some cases where interaction modeling or prediction are likely to fail as also described above the paper currently has few discussions on the limitations and typical failure cases for the proposed approach
### Summary:
the reviewers agreed this paper was presented well and was a valuable contribution we urge the authors to take the reviewers comments into account in the final version also please increase the size of the tables the font size is quite small maybe too small
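The point the reviewers debate, several latent interaction graphs over the same agents with attention normalized across agents in each graph (the eq 2 question), can be made concrete with a small sketch. This is only an assumed toy version of the multiplex-attention idea, not IMMA's architecture; the projections, dimensions, and embeddings are invented for the example.

```python
import numpy as np

# Toy sketch (assumed, not IMMA's code): K separate latent "interaction graphs" over N agents.
# Each layer k gets its own attention matrix; for every agent i the weights toward the other
# agents are softmax-normalized, which is the normalization questioned in the review of eq 2.
rng = np.random.default_rng(1)

N, D, K = 5, 16, 3                         # agents, embedding size, number of latent graph layers
h = rng.normal(size=(N, D))                # per-agent state embeddings (made-up values)

adjacency = np.zeros((K, N, N))
for k in range(K):
    wq = rng.normal(size=(D, D))           # hypothetical per-layer query projection
    wk_ = rng.normal(size=(D, D))          # hypothetical per-layer key projection
    scores = (h @ wq) @ (h @ wk_).T / np.sqrt(D)     # pairwise scores for interaction type k
    np.fill_diagonal(scores, -np.inf)                # no self-interaction edges
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    adjacency[k] = e / e.sum(axis=1, keepdims=True)  # row i sums to 1 over the other agents

# each agent aggregates messages separately per interaction layer (K parallel graphs)
messages = np.einsum("knm,md->knd", adjacency, h)
print("multiplex adjacency:", adjacency.shape, "messages:", messages.shape)
```

Because each of the K graphs is normalized independently, a leader-follower relation and a collision-avoidance relation can coexist between the same pair of agents; the reviewer's suggested alternative would replace the per-row softmax with an independent binary classifier for each candidate edge.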
input_ids (token id sequence for this example; abridged): [ 30003, 310, 1677, 2278, 273, 247, 2561, ... 1512, 1355, 209 ]
attention_mask (all ones for this example; abridged): [ 1, 1, 1, ... 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 789, 39223, 253, 1895, 273, 14053, 4471, 12788, 2718, 326, 3831, 247, 1029, 4248, 273, 5016, 875, 6083, 281, 1566, 824, 2718, 436, 789, 29328, 5016, 14053, 342, 25993, 4116, 516, 785, 534, 4648, 247, 25993, 4216, 21624, 2605, 281, 1566, 1027, 3510, 273, 6355, 275, 1027, 8090, 273, 253, 25993, 275, 1635, 4116, 4216, 8090, 403, 908, 281, 9232, 253, 4757, 273, 253, 2493, 33810, 6701, 281, 253, 2605, 273, 253, 25993, 21624, 8090, 436, 789, 29328, 13439, 3828, 3733, 499, 85, 835, 581, 476, 6194, 1027, 5016, 8090, 11794, 285, 10040, 3146, 6240, 1027, 21624, 8090, 253, 4081, 2746, 310, 6760, 327, 1264, 1027, 15302, 44762, 2355, 2266, 1543, 327, 512, 33810, 28913, 2175, 921, 849, 1907, 2709, 21624, 8090, 285, 263, 1907, 499, 85, 1361, 253, 3210, 10554, 3045, 33810, 253, 2675, 15034, 3126, 3400, 253, 3216, 33024, 5016, 2605, 273, 1016, 6200, 436, 789, 2722, 326, 253, 8131, 21624, 2605, 26295, 3216, 5083, 625, 13613, 685, 2720, 7274, 20544, 50276, 783, 2929, 310, 973, 10932, 285, 973, 15720, 50276, 74, 2868, 253, 2898, 273, 25993, 4116, 275, 253, 21624, 2317, 310, 4460, 281, 253, 1895, 273, 4471, 12788, 3200, 16923, 272, 342, 5016, 14053, 50276, 783, 1543, 403, 2266, 285, 21414, 327, 2709, 15302, 253, 28913, 2175, 2085, 2007, 1941, 273, 253, 11839, 273, 1907, 2709, 2308, 273, 4116, 50275, 20881, 1255, 265, 50276, 249, 253, 30762, 352, 310, 5393, 326, 3733, 310, 2684, 342, 2805, 545, 478, 89, 18, 394, 436, 5678, 342, 253, 43302, 327, 3239, 495, 327, 534, 891, 452, 247, 1953, 2708, 352, 3133, 751, 253, 4081, 2746, 943, 417, 320, 26115, 39651, 347, 247, 260, 21574, 285, 310, 625, 273, 247, 1566, 534, 556, 247, 2173, 8763, 2605, 327, 253, 3673, 44856, 326, 310, 417, 3963, 1025, 407, 667, 6024, 5621, 24088, 27451, 23279, 841, 403, 18212, 9713, 275, 253, 6452, 285, 253, 30762, 5474, 339, 431, 248, 2929, 29328, 247, 25993, 4116, 1332, 323, 4471, 12788, 5016, 14053, 253, 2022, 16038, 310, 326, 627, 476, 320, 625, 685, 581, 5016, 1511, 875, 6083, 281, 2953, 436, 253, 2929, 29328, 281, 897, 2709, 21624, 14580, 281, 22573, 253, 1027, 6355, 875, 6083, 285, 27532, 253, 362, 3348, 7792, 281, 9441, 253, 21624, 11646, 281, 11990, 4715, 253, 2929, 671, 29328, 13439, 3828, 3733, 534, 310, 247, 24642, 4715, 1332, 326, 13237, 11323, 21624, 14580, 281, 253, 1566, 27163, 403, 2684, 327, 4471, 12788, 18974, 16923, 272, 970, 581, 275, 5967, 15524, 10895, 285, 767, 1345, 15302, 3408, 285, 295, 5830, 253, 1332, 41731, 13015, 2720, 3082, 275, 16923, 272, 7200, 38524, 17032, 7200, 285, 3410, 6733, 4757, 50275, 783, 2929, 310, 3839, 973, 15720, 285, 4088, 281, 2096, 253, 16038, 310, 2590, 323, 9706, 2709, 3510, 273, 5016, 875, 6083, 285, 616, 4103, 4757, 50276, 783, 2929, 2684, 9470, 4679, 285, 6260, 327, 1264, 15302, 1097, 18974, 16923, 272, 285, 5886, 17032, 403, 6760, 253, 21624, 4216, 19763, 4679, 403, 4722, 534, 921, 253, 21624, 11646, 22573, 6657, 25739, 254, 13576, 50276, 783, 2929, 41731, 13015, 253, 1666, 25379, 3012, 50276, 20881, 1255, 50275, 783, 2022, 2934, 25993, 4116, 273, 436, 2929, 310, 3240, 2074, 281, 2803, 1025, 295, 363, 337, 835, 2709, 21624, 14580, 403, 908, 275, 247, 39762, 17032, 7792, 253, 7681, 38135, 273, 436, 2929, 310, 3710, 275, 690, 3282, 253, 25993, 4116, 310, 417, 247, 1943, 1818, 432, 295, 363, 1580, 253, 21624, 2127, 3139, 310, 23964, 37613, 3103, 1016, 7877, 476, 320, 4127, 347, 247, 4858, 21624, 2127, 285, 253, 2022, 3064, 273, 253, 4081, 
2746, 310, 281, 417, 3894, 32049, 13461, 50276, 783, 2929, 42126, 7277, 342, 2266, 4471, 12788, 18974, 16923, 272, 1666, 25379, 327, 10610, 18974, 16923, 272, 49602, 824, 347, 5105, 86, 951, 835, 247, 1781, 1180, 273, 6083, 403, 18745, 342, 1016, 643, 253, 2929, 812, 7277, 342, 256, 5503, 3082, 824, 347, 374, 495, 577, 50276, 8826, 2216, 10165, 403, 417, 973, 17194, 752, 310, 253, 16038, 323, 2602, 4090, 2439, 6083, 275, 16186, 374, 436, 1030, 44196, 253, 5016, 4757, 432, 891, 281, 643, 6083, 281, 2020, 281, 337, 323, 5016, 465, 2299, 5570, 891, 476, 452, 2266, 5016, 4757, 342, 2709, 1027, 6083, 323, 5016, 465, 3103, 891, 4282, 2139, 253, 1332, 36908, 816, 36803, 17032, 273, 1182, 16392, 347, 247, 8985, 9162, 436, 5016, 4961, 390, 417, 3185, 273, 23559, 14407, 9162, 2439, 643, 6083, 50276, 18, 4384, 67, 299, 91, 376, 1162, 355, 2803, 1701, 11454, 38524, 17032, 323, 4471, 2388, 1913, 2718, 575, 280, 1686, 6247, 29561, 50276, 19, 19695, 45321, 465, 435, 6811, 2364, 66, 1162, 355, 432, 7342, 1039, 10801, 50276, 34605, 281, 1048, 1307, 1966, 18974, 16923, 272, 575, 856, 22868, 273, 253, 26332, 70, 886, 39985, 5213, 8059, 327, 4382, 8113, 43425, 50276, 20, 340, 9041, 9094, 1162, 355, 5570, 19946, 5570, 13823, 4979, 398, 323, 10621, 302, 358, 23702, 4471, 12788, 16923, 272, 575, 856, 22868, 273, 253, 26332, 70, 886, 39985, 5213, 8059, 327, 4382, 8113, 43425, 50276, 21, 259, 606, 448, 6968, 5738, 1162, 355, 3213, 3020, 4736, 17477, 6928, 323, 18974, 10554, 575, 466, 1796, 15688, 982, 285, 29885, 4876, 575, 3547, 1384, 1423, 3435, 1036, 1630, 1508, 50276, 783, 2929, 42126, 1663, 2312, 670, 697, 12291, 5747, 7809, 275, 253, 830, 352, 556, 2218, 594, 50276, 531, 2590, 12291, 310, 326, 352, 2550, 6016, 6890, 5016, 3510, 347, 253, 4471, 12788, 5016, 43279, 689, 673, 616, 5016, 812, 671, 23554, 323, 1650, 247, 47201, 11999, 247, 6657, 281, 2489, 253, 747, 6657, 3332, 789, 327, 7870, 295, 363, 608, 721, 556, 9919, 281, 2953, 436, 12291, 50276, 22, 10013, 254, 847, 249, 285, 247, 1591, 5945, 5807, 7706, 7870, 11454, 38524, 17032, 323, 16923, 272, 24102, 575, 856, 22868, 273, 253, 26332, 70, 886, 39985, 8059, 327, 4382, 8113, 285, 3102, 8981, 29561, 9169, 50276, 23, 1269, 22728, 8864, 469, 8500, 637, 763, 465, 22711, 1625, 73, 285, 9461, 340, 86, 7870, 38524, 17032, 275, 4471, 12788, 24102, 575, 280, 32888, 43425, 5474, 339, 431, 248, 4477, 4081, 247, 10554, 1566, 326, 4648, 247, 25993, 21624, 4216, 281, 1957, 2709, 3510, 273, 6355, 285, 4116, 281, 2395, 323, 1027, 2493, 1925, 5016, 14053, 342, 25993, 4116, 516, 785, 597, 671, 9569, 247, 3733, 5700, 1925, 13439, 3828, 3733, 499, 85, 253, 5661, 1543, 2692, 326, 616, 2746, 41731, 10574, 8245, 3210, 275, 18974, 16923, 272, 285, 5886, 17032, 28369, 1264, 4471, 12788, 15216, 2675, 15034, 27293, 4836, 19797, 285, 2285, 9001, 597, 671, 5183, 1805, 1182, 254, 6934, 302, 26647, 1543, 275, 253, 2675, 15034, 3126, 50275, 296, 2013, 72, 394, 50276, 9328, 4081, 247, 10554, 1566, 326, 24772, 33362, 4071, 4216, 11454, 6928, 285, 499, 85, 323, 29375, 253, 25993, 264, 21624, 2605, 50275, 783, 4679, 2692, 326, 616, 2746, 41731, 10574, 8245, 3210, 275, 18974, 16923, 272, 285, 5886, 17032, 28369, 1264, 4471, 12788, 15216, 597, 671, 5183, 8936, 3410, 6733, 285, 1805, 1182, 254, 6934, 302, 26647, 1543, 50276, 20881, 1255, 50276, 783, 38135, 273, 436, 1332, 3133, 281, 320, 16248, 33362, 4071, 4216, 11454, 6928, 285, 499, 85, 533, 253, 4477, 5393, 326, 1016, 4445, 556, 38135, 534, 778, 320, 36074, 891, 2546, 436, 1953, 275, 253, 1563, 3533, 50275, 783, 906, 369, 2590, 533, 627, 
369, 642, 2127, 2130, 891, 3047, 44282, 285, 627, 497, 690, 3533, 275, 253, 3368, 5740, 50274, 783, 4477, 9713, 253, 7364, 285, 2442, 4016, 38058, 3486, 273, 616, 789, 50276, 7152, 33032, 2520, 2929, 10262, 247, 747, 1332, 323, 14053, 285, 21565, 4471, 12788, 5016, 20764, 326, 824, 6355, 812, 320, 5876, 407, 2709, 2616, 824, 347, 2303, 31684, 285, 15708, 28772, 1309, 9539, 15034, 8892, 253, 2234, 2934, 369, 281, 1566, 731, 970, 25993, 14580, 5742, 2581, 685, 49653, 3510, 273, 9297, 273, 14580, 347, 2218, 275, 5368, 789, 253, 4081, 2746, 3185, 1347, 23559, 14407, 13650, 323, 2709, 5024, 3510, 285, 3989, 25993, 4116, 12624, 281, 1056, 21624, 4216, 8090, 11117, 253, 4081, 789, 23970, 247, 13439, 4715, 6974, 13439, 3828, 3733, 326, 13237, 5459, 253, 1180, 273, 4216, 8090, 281, 3037, 5661, 1543, 327, 2709, 15302, 921, 253, 12510, 273, 253, 4081, 2746, 689, 5368, 3082, 50275, 296, 3755, 20556, 50276, 19164, 414, 14053, 4471, 12788, 5016, 970, 25993, 14580, 3133, 4460, 387, 1878, 323, 18974, 16923, 272, 8892, 352, 310, 973, 17194, 407, 253, 7313, 326, 1966, 6355, 476, 320, 5876, 407, 2709, 2616, 50276, 15177, 891, 5194, 326, 253, 4081, 2746, 310, 973, 4158, 285, 22335, 3590, 253, 3533, 326, 253, 2929, 12453, 403, 4518, 3559, 2805, 18, 82, 21, 275, 2593, 577, 285, 9577, 275, 253, 5661, 1543, 3368, 4278, 347, 973, 347, 2442, 38058, 16274, 403, 4518, 2529, 275, 253, 30762, 50276, 498, 15752, 4583, 253, 2929, 310, 4518, 3542, 285, 3477, 281, 956, 50276, 9188, 40348, 891, 2868, 326, 253, 4081, 789, 310, 4942, 1534, 7194, 984, 352, 41731, 10574, 5368, 3082, 327, 2709, 15302, 253, 1182, 254, 6934, 302, 26647, 3368, 369, 4722, 285, 4645, 253, 31471, 273, 253, 4081, 2746, 50275, 20881, 1255, 265, 50276, 783, 15265, 839, 1650, 323, 2709, 2616, 13567, 6355, 26332, 2303, 3331, 285, 15708, 28772, 275, 4677, 337, 778, 320, 247, 2372, 21643, 604, 891, 2096, 352, 9113, 25993, 14580, 403, 4270, 2581, 275, 247, 5004, 484, 8142, 275, 253, 4081, 1332, 835, 359, 2550, 5416, 326, 1016, 4216, 3828, 556, 824, 24705, 4495, 275, 958, 516, 1335, 247, 2372, 13477, 752, 253, 25993, 4216, 2686, 6311, 310, 436, 816, 247, 4283, 5570, 347, 2011, 275, 4677, 608, 390, 1633, 4457, 50276, 6050, 352, 369, 1270, 326, 253, 4081, 1332, 17923, 1077, 973, 891, 651, 671, 751, 281, 923, 6867, 4433, 2219, 403, 627, 690, 2219, 835, 5016, 14053, 390, 10554, 403, 2779, 281, 1891, 347, 671, 2529, 1840, 253, 2929, 4390, 556, 1643, 11985, 327, 253, 7364, 285, 6867, 4433, 2219, 323, 253, 4081, 2746, 2490, 187, 4118, 18435, 27, 783, 30628, 5821, 436, 2929, 369, 3559, 973, 285, 247, 9865, 7680, 359, 21434, 253, 4477, 281, 1379, 253, 30628, 5701, 715, 2395, 275, 253, 2457, 2715, 50275, 12563, 4496, 2572, 253, 1979, 273, 253, 7180, 50276, 783, 8266, 1979, 310, 3240, 1355, 5046, 1512, 1355, 209 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper introduces a first person videotext pretraining dataset called egoclip comprising 38m cliptext pairs chosen from ego4d ego4d is a largescale dataset comprising first person videos from a large variety of daily human activities the paper further proposes a pretraining objective called egonce to adapt the videotext contrastive learning to the egocentric domain by mining egocentric aware positive and negative samples finally the paper introduces a new development benchmark called egomcq that is close to the egoclip dataset and can therefore be used to effectively validate and explore algorithms for egoclip which they showcase by exploring egonce on egoclip egoclip is sourced from ego4d through a series of criteria a set of pairs of video and textual narrations are chosen with low amount of noise in terms of videotext misalignment and videotext irrelevance the final dataset has 29k hours of videos with 385 million narrations from 129 different scenarios there are 219 clips per minutes with average clip length of 1s and standard deviation of 09s this is followed by designing a contextual variablelength clip pairing strategy where centered around the instantaneous timestamp of narration a temporal duration of betaalpha is specified such that beta is the average temporal distance between pairs of consecutive narrations and alpha is the scale factor computed as the average of all betas across all videos in egoclip for videolanguage pretraining the paper proposes egonce which is an extension of infonce contrastive loss between videotext pairs egonce adapts infonce for egocentric videos by doing actionaware positive sampling where batch samples that share at least one noun and at least one verb are treated as positive samples it further has sceneaware negative sampling where different actions in the same scenario are treated as hard negative samples finally the paper develops a new development benchmark to evaluate on egoclip called egomcq the motivation is that standard videotext retrieval may be ineffective since there are substantial duplicatesemantically similar captions in ego4d multiple choice question mcq task makes repetitions highly unlikely given a small number of answers egomcq considers an intervideo setting where five clips originate from different videos aiming to distinguish instances from different scenarios egomcq also considers an intravideo setting by grouping five continuous clips together focusing on finegrained context clues such as hand interaction to validate the proposed method experiments are conducted on five egocentric benchmarks experiments show the egoclip and egoclip along with egonce perform better than existing methods on all benchmarks this is because egoclip helps bring pretraining closer to egocentric downstream tasks ablation is conducted on the effect of egonce showing that it is more effective than infonce ablation is also conducted to validate the effectiveness of the proposed development benchmark egomcq strengths 1 the proposed components of the work overall have significant novelty and help to improve pretraining for egocentric downstream tasks egoclip and accompanying egomcq benchmark will be very helpful to the egocentric community to develop better pretrained models suited for egocentric tasks 2 the overall paper quality is also high the paper is easy to read follow and understand 3 extensive evaluation has been performed to 
evaluate all aspects of the proposed method the experiments help gather insight on each aspect in a comprehensive manner weaknesses 1 since egocentric videos are untrimmed and consecutive actions are quite finegrained i have some doubts about the sceneaware negative sampling which considers different actions in the same scenario as hard negative samples but it is possible that different actions in the same scenario can share the same noun in that case it appears it might conflict with the actionaware positive sampling is it possible that sceneaware negative sampling can introduce false negatives 2 all experiments for egoclip seem to be conducted using the frozen method and timesformer backbone it would be more helpful to see if just for proof of concept on maybe one setting how the dataset and approach performs with another backbone i believe the authors have adequately addressed the limitations and potential negative societal impact of their work docsepthis paper studies videolanguage pretraining for egocentric videos the authors claim 3 contribution 1 cleaning the ego4d dataset to form a large videotext training corpus 2 proposed egonce to tackle the two challenges same action in different scenes and different actions in the same scene in egocentric videotext pretraining 3 built on top of the ego4d dataset the authors introduced egomcq as a development set strengths the authors studies pretraining for egocentric videos which is an important topic for the community and could be impactful the experimental results are mostly positive supporting the claims in the paper weaknesses there is a lack of architectural novelty the model architecture is essentially the same as frozen 11 l158165 the authors proposed actionaware positive sampling to address challenge 1 l151 which relies on the taxonomy annotation specific to the ego4d dataset and has limited scope which may not applicable to more general domain ego videos the sceneaware negative sampling is essentially negative sampling from the same video this technique is often used for tasks like video moment retrieval hendricks et al lei et al thus it is also not new in the ablations it is shown that using egoclip alone is better than using 3rd person view data either ht100m or cc3mwebvid2m it would be interesting to also explore whether the model can benefit from training on a combination of 1st and 3rd view data eg pretraining on egoclipcc3mwebvid2m the two challenges discussed in l150155 do not seem to unique to 1st person videos eg the same action running could happen in a street a park or in a playground the claimed contribution i l6 seems trivial it is almost simply done by doing a bit transform of the ego4d dataset annotations it is not clear why we need another development set egomcq if there are already a bunch of tasks which already has their development set hendricks et al localizing moments in video with natural language iccv 2017 let et al qvhighlights detecting moments and highlights in videos via natural language queries neurips 2021 the limitations are discussed in sec 7 docsepthis paper proposes a method for videolanguage learning specifically focused on the egocentric setting the paper introduces a new way to mine positivenegative samples for infonce style training based on sampling by overlapping nouns and verbs in the video caption the paper introduces a new evaluation setting on the ego4d dataset and evaluates on epic kitchens and charades ego the proposed mining method for the positivenegative samples is interesting and 
effective the results show good performance across multiple datasets and tasks the stateoftheart comparisons also show the difference due to different pt datasets and other settings overall the results are fairly complete and show the benefits of the components minor comments fortunately with the recent introduction of the massivescale egocentric video dataset ego4d 16 it becomes possible to unlock egocentric vlp that is a strong claim given that in table 1 ego4d is still a fraction of the size of other datasets eg howto100m webvid youtubetemporal180m etc i wouldnt call ego4d massivescale the results also are significantly better than the previous results which suggests this isnt really unlocking anything there are typos and grammatical errors throughout the paper it would benefit from careful editing some of the notation is unnecessary complicated for example the subscript i in eq 1 doesnt add anything ie that section could be revised without the subscript without any loss in clarity this would also simplify table 2 note that the answer to the licenses is incorrect ego4d does not have the mit license and has its own license with certain restrictions there is some slight discussion on these but it is very limited for example there are many societal impacts of egocentric videos such as surveillance privacy issues etc that could be discussed this goes beyond the impact of the electricity used to train the models docsepthis paper proposes a new benchmark for egocentric video understanding based on ego4d a pretraining set and a model for the pretraining and downstream finetuning given the untrimmed ego4d videos the authors do the curations and build the egpclip based on the recent work frozen a contrastive video learning model with a new loss rgonce is proposed to test the above design the authors also propose a valid set named egomcq based on the val data of ego4d on several downstream tasks the pretraining set and model show impressive performance compared to existing firstview pretraining works given the largescale videotext data from ego4d pros the curation and cleaning for ego4d are important and beneficial for the community the illustrations are detailed and easy to follow the suppl gives much useful information the experiments indeed show good signs when we use this new curated dataset for egocentric video learning the main setting and curation are sound cons the largest concern this work seems not to differentiate the contribution of ego4d and this submission 1 the tab1 is misleading as the data scale and contribution belong to ego4d instead of this work this work conducts mainly the curation and cleaning so i suggest comparing both ego4d and egoclip here to avoid this misleading 2 the results look good and impressive but the readers may wonder how much of them come from the curation though without curation the videos in ego4d may be not easy to directly use the performance improvements can be definitely thanks to the data contribution of ego4d when compared with the firstview data pretraining only an egoclip is weird and misleading thus i suggest the authors add the tests to use the original videos of ego4d in soma ways or if not at least discuss this clearly if we think about this point the data contribution of this work and the performance claim may be misaligned 3 considering the method contribution is relatively smaller compared to the data in my opinion this work may be more suitable for the benchmark and dataset track if the main contributions are not revised the evaluation the 
retrieval tests are essential but more video recognition tests can make the evaluation more solid eg on egtea epic etc typo l246 ieeq1 na ### Summary:
for egocentric videolanguage pretraining this paper creates a 1stperson videotext pretraining dataset proposes a new contrastive loss egonce and builds a new benchmark egomcq although the contribution of this work is somewhat incremental its motivation experimentation and organization are good besides all reviewers agree that egocentric videolanguage pretraining is an important topic for the community i hence suggest accepting it
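The EgoNCE description in the review above (positives mined from narrations that share at least one noun and one verb, hard negatives drawn from clips of the same scenario) can be sketched as a small multi-positive InfoNCE variant. This is only an illustrative reconstruction from the review's wording, not the authors' released implementation; the function name, tensor shapes, and the choice to build the denominator from positives plus same-scene pairs are all assumptions.

```python
import torch

def egonce_style_loss(video_emb, text_emb, nouns, verbs, scene_ids, temperature=0.07):
    """Illustrative multi-positive InfoNCE with action-aware positives and
    scene-aware hard negatives, reconstructed from the review's description.

    video_emb, text_emb : (B, D) L2-normalised clip / narration embeddings
    nouns, verbs        : length-B lists of Python sets parsed from each narration
    scene_ids           : (B,) tensor with the source-video id of each clip
    """
    B = video_emb.size(0)
    sim = video_emb @ text_emb.t() / temperature          # (B, B) similarity logits

    # Action-aware positives: pairs whose narrations share >= 1 noun and >= 1 verb.
    pos = torch.eye(B, dtype=torch.bool, device=video_emb.device)
    for i in range(B):
        for j in range(B):
            if nouns[i] & nouns[j] and verbs[i] & verbs[j]:
                pos[i, j] = True

    # Scene-aware hard negatives: other clips drawn from the same scenario.
    same_scene = scene_ids.unsqueeze(0) == scene_ids.unsqueeze(1)     # (B, B)
    denom_mask = pos | same_scene   # one possible reading: positives + same-scene negatives

    exp_sim = sim.exp()
    loss = -torch.log((exp_sim * pos).sum(1) / (exp_sim * denom_mask).sum(1)).mean()
    return loss
```

The exact way the original method incorporates same-scene clips (masking versus adding extra clips to the batch) may differ, so this should be read only as a shape-level illustration of the two sampling masks the review describes.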
[ tokenized sequence columns (input_ids, attention_mask, labels) for the example above omitted ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper curates data from longitudinal mobile and wearable sensors over multiple years the dataset contains additional information on physical health depression etc the paper also demonstrates how the dataset can be used to compare benchmark algorithms aimed at depression detection tasks the paper performs crossdataset evaluations of the benchmark algorithms the dataset is a valuable resource for comparing algorithms used for modeling human behavior or predicting certain human characteristics the documentation is quite elaborate and easy to follow the performance of the algorithms for the detection task is not very impressive there may be possible noises and hopefully curating more data over years would make it better docsepthis manuscript presents a longitudinal mobile and wearable sensing dataset for four years with various types of selfreport data such as personality physical health depression emotion and social wellbeing it also lists the benchmark results of 18 algorithms for depression detection tasks and supports crossdataset evaluations of behavior modelling algorithms generalizability although many human behavioural modelling research were done and some of related datasets were released previously there is no open dataset for comparing and evaluating various modelling algorithms bring difficulty to the development of this field in this research the dataset is very well presented and the experiments are carefully designed i believe it has great potential to the field and it should be accepted with some issues addressed the contribution is significant and the dataset is relevant to the broader research community the collected dataset includes a large scale of data smartphones wearables multiple questionnaires from 705 personyears various behavioral modeling algorithms for depression detection tasks are evaluated on the dataset and show the lack of generalizability of all existing algorithms this dataset may assist researchers to develop more generalizable algorithms and explore interesting insights into the relationship between sensing data and human wellbeing metrics the accessibility and accountability are good all the data are clearly presented and explained the ethics are well considered and the documents related to data collection are complete 1 the authors claimed that one of the contributions is that the dataset can serve as an open testbed for crossdataset generalization tasks to evaluate the generalizability and robustness of behavioral modeling algorithms however there are many public datasets online and they can also serve as the open testbed for evaluation if one advantage of this dataset is the capability of evaluating generalization why not compare the depression detection performance of existing algorithms on other popular datasets 2 why do you use binary classification i think doing regression may more accurately reflect the depression score especially considering the relatively low accuracy of the experiment result 3 where is sa5 i could not find the feature details as sa5 is missing 4 please explain the reason for every decision you made eg why add a 3level discretized version instead of 2 or 4level in addition you mention that you omit the missing values during analysis and use a medianbased imputation when necessary what about the features with a very high missing rate in this case medianbased imputation may be not accurate 5 correlation analysis in 
the manuscript pearson correlation coefficients were computed between every feature and the depression label however the calculation of the pvalue relies on the assumption that each item is normally distributed i could not find the description of the distribution of features 6 technical validation authors should provide information to justify the reliability of their data eg statistical analysis of experimental error and variation discussion of any procedures used to ensure reliable and unbiased data production docsepthe paper summarizes a dataset of smartphone and fitbit data spanning 439 unique subjects across 4 10week periods one period per year it contains data about phone screen usage location call durations nearby bluetooth devices maps sleep and steps it also includes surveys related to mental health and wellbeing statistics and distributions are presented to summarize the data visualizing weekly trends and comparisons between precovid and postcovid data ml models are used to distinguish datasets distinguish subjects predict depression and generalize across datasets a variety of existing learning pipelines are tested it is an impressive experimental procedure to collect data over the course of 4 years from hundreds of subjects having consistent data from such periods can be beneficial for a variety of machine learning purposes the included mental health information provides one immediate use but hopefully the data can be leveraged for other purposes as well the presented graphs provide good visualizations of the data clearly demonstrating weekly trends and differentiating the datasets the correlations results indicating how selected metrics relate to the depressions results was also quite interesting a wide variety of machine learning pipelines are also tested the data is presented as csv files which is a highly portable format a few questionscomments that came to mind are summarized below it is mentioned that the platform may be extended to other behavior modeling tasks beyond depression detection but it is unclear how groundtruth labels would be obtained for other tasks i may suggest adding some discussion about the possible scope of future tasks and how labels would be created for them this may be worth also addressing a bit in the abstract the scope of behavior modeling introduction etc if possible it may be helpful to clarifyelaborate the phrase crossdataset that is prominent in the discussions it is intuitive to the extent that it considers applying a model across different datasets but scope and the definition of dataset could be clarified for example globem seems to be considered as 4 datasets where each one is from a different year but i would tend to consider it as a single multiyear dataset in addition the scope of crossdataset analysis in this case keeps constant all of the data streams formats targets experimental protocols etc this seems like it may be more aptly considered as splitting a dataset into multiple years especially with overlapping subjects in each split whereas crossdataset sounds like there may be more substantial differences or approaches note that this ties nicely with your points about it being a challenging task since even this case where the datasets are so similar is still challenging it is great to explore so many different models but i wonder about their target domain were each of them designed for a task such as this using the same data streams available in this dataset if they were designed to operate for somewhat different tasks it may help 
explain their relatively low performance it was odd that they were all so low even on singledataset conditions so i wonder where their strengths lie and how their original applications differ from the current tasks given the relatively low performance of all algorithms i wasnt entirely sure of the desired conclusions is it simply meant to show that this is a hard task is it meant as a benchmark for future depressions detection pipelines this is also related to the previous question about how the training data and experimental procedures of the tested pipelines compare to the current tests to ensure that it is a fair test for them it was mentioned that they suffer from overfitting but their scores on singledataset tasks were also fairly low the distinguishaperson task seems to actually be distinguishing personyears rather than unique subjects since there are 705 targets instead of 497 maybe this causes issues with the training seeing more data from some subjects than others and conflating subject differences with dataset differences some additional discussion may be interesting here just to check are the domain generalization algorithms simply different network architectures that were designed to generalize well or is there a more fundamental difference between them and the depression detection algorithms the abstract mentions over 700 users which may be misleading since there are 497 unique users of course both numbers are very impressive until i looked at some data files it wasnt immediately clear that the datasets sampling rate was effectively once per day that summary data is provided for each stream for each day i know it was mentioned on line 185 but it seems notable enough to highlight it im not quite sure where the aggregated features from the other time ranges are presented but perhaps they are in different data files some more information about rapids may be helpful to include in the main paper it may be helpful to mention the fitbit models used presumably they didnt include heart rate or userinitiated exercise labels docsepthe authors focus on modeling longitudinal behavior signals and observe that there are no publicly available datasets with multiple domains so the authors propose the first multiyear mobile and wearable sensing datasets as an open testbed the authors reimplement 18 methods and provide the benchmark results extensive results reveal the limitations of the current methods eg lack of generalizability the dataset along with the benchmark results will advance the development of generalizable longitudinal behavior modeling 1 the first multiyear behavior dataset that supports domain generalization study 2 thorough and comprehensive data collectionprocessing description 3 an outofthebox benchmarking platform with 18 integrated algorithms my main concern is the possible overlapping content with the authors underreview submission 7 it seems these two papers use the same platform ie globem the authors also claim that many texts are taken from 7 with courtesy docsepthis paper provides the first longitudinal mobile and wearable sensing dataset and reports the benchmark results for 18 related algorithms for the depression detection besides in this paper the data collection process and visualized analysiseg data distribution and correlation are present which help researchers to understand this dataset quickly 1 the provided dataset is the first multiyear datasets from mobile and wearable sensing over 700 personyears with overlapping users interesting the dataset includes 
prepost covid behavioral data 2 the data collection process is clear which shows the rarity and input cost besides this paper provides relevant data analysis in terms of distribution correlation and domain classification 3 the authors conduct experiments on two tasks ie prior depression detection and recent domain generalization which offer a useful benchmark result for future research 1 inadequate sample size and type although the dataset contains four years of behavioral data the number of users is small and the number of behavior types collected is also small 2 lack of comparison with other human behavior datasets which is a very important work for dataset papers 3 i encourage the authors to provide a discussion of the effects of errors introduced by the collection equipment experimental participants docsepthe paper presents the first multiyear passive sensing datasets containing over 700 users data collected from mobile and wearable sensors together with a wide range of wellbeing metrics they benchmarked several algorithms to develop generalizable longitudinal behavior modeling algorithms with a focus on the depression detection task the surveys and the depression detection task are well performed the paper is well written and the features for the depression detection task are interesting the dataset seems promising for mental health future research what is the difference between the dataset in 7 and this dataset since the authors are the same the dataset associated with the submission then is not fully open as claimed in the introduction a short background of the depression detection problem and the related algorithms would make the paper more readable the high missing rate for the data may affect generalizability of the findings since a trivial method is used for handling such missing data in page 6 having a rigorous analysis for each year and comparing between pandemic and postpandemic is more interesting compared to just a general correlation score in page 8 the authors state current algorithms crossdataset generalizability is still far from satisfactory for reallife deployment a more detailed explanation of the intuition behind such a challenge and suggested directions are essential for the paper despite having a short paragraph in the discussion section having the selfreport aspect for depression measures and not having backup intelligent prediction is also a shortcoming of the paper the following sentence is not clear our datasets can serve as an open testbed for multiple crossdataset generalization tasks to evaluate a behavior modeling algorithms generalizability and robustness eg same or different users across multiple years the following finding seems trivial and it is well known from psychology literature this suggests that participants often commuted to offcampus places on weekends further participants tended to leverage weekends to catch up on sleep as shown by the peak inbed duration around weekends docsep the authors release the first multiyear longitudinal mobile sensing datasets covering over 700 personyears multiple sensor datasets ie location phone usage calls bluetooth physical activity and sleep behavior and wellbeing metrics eg physical health mental wellbeing social wellbeing wellbeing metrics are collected by prepost surveys and short twiceweekly ecological momentary assessment surveys during the study their benchmark results on the depression detection task demonstrate that current algorithms need to be further developed in terms of generalizability they analyze the distribution of data
before and after covid or on weekdays and weekends they also conduct correlation analysis capturing the relationship between daily behaviors and wellbeing metrics the two domain classification tasks they conduct show significant distribution shifts among datasets and individuals they provide benchmark results of 9 prior depression detection algorithms and 9 deeplearning domain generalization algorithms on the depression detection task the experimental setups are divided into a single dataset and a cross dataset the best depression detection algorithms are better at the singledataset task while the best domain generalization algorithms are better at crossdataset tasks a significant performance drop from the single dataset task to the three crossdatasets shows that current algorithms are still far from reallife deployment this study focuses on inthewild timeseries data which has not been widely explored in terms of domain generalization compared to computer vision or natural language processing domains they collect sufficient features location phone usage bluetooth call physical activity and sleep participants completed weekly short surveys and extensive surveys before the start and at the end of the study prepost surveys cover personality bfi10 physical health chips mental wellbeing bdiii and social wellbeingsense of social and academic fit scale eds the short surveys include phq4 pss4 and panas this covers data from four consecutive years which is extensive enough to be served as longitudinal research the four years consist of two years before covid and two years after covid this leads to a domain shift between datasets and it makes the released data more informative and suitable for targeting domain generalization the benchmark contains both depression detection algorithms and domain generalization algorithms it would have been better if they provide the statistics of their data clearly and compare it to the few other datasets which can be compared to globem dataset also as they mentioned mobile sensing data tend to have a high data missing rate but they did not provide the full missing rate of each feature it is critical for timeseries data this is targeting human behavior modeling but the benchmark covers a depression detection task alone with one label it would be better to at least suggest some other tasks related to human behavior modeling with this extensive dataset it is limited to a certain population such as students asian and white and immigrants and firstgeneration participants the results of the cross dataset setup represent the domain generalizability domain classification tasks are provided to show the domain shift of their dataset the namethedataset experiment addresses the domain shift in the leaveonedatasetout setup and the distinguishtheperson experiment explains the domain shift in the overlapping users across datasets section 41 partly explains the domain shift in the prepostcovid setup but it would be better to add similar experiment docsepthe dataset encompasses several original features multiyear thus providing a realistic task for generalization rich implicit data from sensors inc time series relatively diverse population yet all of them are students the paper also provides results for 18 possible models complete information on dataset collection discussion of privacy concerns part of the dataset has already been used to publish eg doryab et al previous benchmarks in depression prediction use audiovideo data avec daicwoz text reece et al survey data sau et al or 
biomarkers sharma et al it seems thus that this dataset complements existing benchmarks the depresjon dataset has an activity feature and is the most similar but lacks rich features the dataset collection is well documented compensation irb notice forms distributed to study participants etc rich collection of results including semimanual psychology backed methods and ml methods the multiyear nature and prepostcovid seem to be very useful to benchmark ood methods table 1 lacks the actual metric being used for model performance balanced accuracy probably even though it is difficult and costly the dataset size could be increased as 700 instances is a bit low for benchmarking ml models with a risk of quickly overfitting the test set the selflabeling is debatable but this has been correctly stated by the authors it is not clear to me how useful the dataset is as it seems that among 18 methods not one is able to perform eg in ood domainbed has several datasets with 50 accuracy could it be that features are not that predictive main weakness 1 it is not clearly stated what model selection method was used to choose hyperparameters of the different methods in ood gulrajani et al have observed that leave one domain out was not the best model selection approach so it could be interesting to provide results using training domain validation leave one domain out and oracle selection see the domainbed paper main weakness 2 the woods benchmark httpswoodsbenchmarksgithubio is widely used in ood and also presents sensor time series data even though the label is different the difference should be discussed in the paper ### Summary:
globem is a multiyear human behaviour sensing dataset captured from mobile and wearable devices this is a highly impactful dataset that will benefit several research domains including machine learning hci and ubiquitous computing affective computing and mental health research the reviewers largely agree on the strong contributions both on the dataset as well as the benchmark task on domain generalisation induced by the behaviour shift before and after covid19 although several of the adopted techniques such as the imputation are on the naive side this can open opportunities for others to work on the dataset and improve the modelling aspects therefore i strongly recommend acceptance of the work
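To make the cross-dataset evaluation protocol that the reviews keep returning to concrete, here is a minimal, hypothetical sketch of a leave-one-dataset-out run with median imputation and balanced accuracy. The per-year DataFrames, the feature columns, and the binary `depressed` label are placeholder assumptions for illustration, not the actual GLOBEM pipeline, its features, or any of the 18 benchmarked algorithms.

```python
# Hypothetical leave-one-dataset-out evaluation: train on three of the four
# yearly datasets, test on the held-out one, impute missing values with the
# median (the naive step the reviews question), and report balanced accuracy.
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import balanced_accuracy_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def leave_one_dataset_out(frames):
    """frames: dict mapping year -> DataFrame with feature columns plus a
    binary 'depressed' label (assumed layout, one row per participant-week)."""
    scores = {}
    for held_out, test_df in frames.items():
        train_df = pd.concat([df for y, df in frames.items() if y != held_out])
        feature_cols = [c for c in train_df.columns if c != "depressed"]
        model = make_pipeline(
            SimpleImputer(strategy="median"),   # median-based imputation stand-in
            StandardScaler(),
            LogisticRegression(max_iter=1000, class_weight="balanced"),
        )
        model.fit(train_df[feature_cols], train_df["depressed"])
        pred = model.predict(test_df[feature_cols])
        scores[held_out] = balanced_accuracy_score(test_df["depressed"], pred)
    return scores
```

The held-out year plays the role of the unseen domain, which is why scores in such a setup are expected to drop relative to a single-dataset split, mirroring the generalizability gap the reviews describe.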
[ 35615, 326, 497, 4158, 281, 39970, 973, 390, 310, 627, 247, 625, 7936, 3064, 875, 731, 285, 253, 9454, 5481, 11333, 50275, 783, 12002, 25957, 689, 18450, 4212, 534, 778, 320, 24363, 1580, 627, 403, 44473, 4451, 4212, 50276, 1171, 2282, 1097, 3904, 403, 1077, 13943, 50275, 27390, 891, 3261, 387, 690, 941, 4367, 352, 369, 2649, 4745, 2590, 326, 253, 15302, 10491, 2281, 369, 8069, 2378, 591, 1388, 326, 6010, 941, 310, 2530, 323, 1016, 5542, 323, 1016, 1388, 50276, 74, 871, 352, 369, 5393, 327, 1386, 23512, 533, 352, 3133, 16613, 2217, 281, 6780, 352, 50276, 303, 417, 3240, 2119, 835, 253, 40006, 3386, 432, 253, 643, 673, 13794, 403, 3559, 533, 4931, 597, 403, 275, 1027, 941, 4367, 50275, 8826, 625, 1491, 670, 17845, 2352, 778, 320, 9371, 281, 2486, 275, 253, 2022, 2929, 50275, 262, 778, 320, 9371, 281, 3748, 253, 4944, 2713, 3210, 908, 50276, 10192, 40224, 597, 42126, 2486, 2798, 2281, 390, 2608, 4478, 4215, 5763, 13301, 5474, 339, 431, 248, 4477, 2770, 327, 14053, 14854, 3879, 6298, 285, 10018, 326, 627, 403, 642, 13644, 2130, 15302, 342, 2709, 10625, 594, 253, 4477, 12661, 253, 806, 4471, 2913, 6109, 285, 8251, 494, 17950, 15302, 347, 271, 1527, 1071, 3026, 253, 4477, 294, 303, 3018, 1283, 3082, 285, 2085, 253, 22791, 1543, 9470, 1543, 10313, 253, 7364, 273, 253, 1655, 3082, 24088, 3480, 273, 2087, 50228, 253, 10895, 2112, 342, 253, 22791, 1543, 588, 7170, 253, 2440, 273, 2087, 12729, 14854, 3879, 14053, 337, 253, 806, 4471, 2913, 3879, 10895, 326, 8525, 5028, 26647, 1263, 374, 11080, 285, 11088, 941, 4849, 21678, 5740, 495, 271, 562, 23037, 248, 3364, 22791, 272, 5147, 342, 1283, 8527, 11333, 619, 2022, 4468, 310, 253, 1896, 21481, 2600, 342, 253, 4477, 762, 15337, 19529, 818, 352, 3133, 841, 767, 9380, 897, 253, 1072, 5147, 26332, 13371, 358, 253, 4477, 671, 1750, 326, 1142, 17438, 403, 2668, 432, 818, 342, 23633, 5474, 33032, 2520, 2929, 3400, 253, 806, 14854, 6109, 285, 8251, 494, 17950, 10895, 285, 5012, 253, 22791, 1543, 323, 1283, 2905, 11333, 323, 253, 9454, 5481, 16280, 275, 436, 2929, 253, 941, 4849, 1232, 285, 27130, 5127, 885, 72, 941, 3268, 285, 5921, 50276, 609, 1246, 534, 1361, 8607, 281, 2096, 436, 10895, 4541, 50276, 18, 253, 2530, 10895, 310, 253, 806, 4471, 2913, 15302, 432, 6109, 285, 8251, 494, 17950, 689, 18450, 1436, 10526, 342, 21481, 4212, 4722, 253, 10895, 3797, 3765, 493, 9383, 301, 14613, 941, 50275, 19, 253, 941, 4849, 1232, 310, 2590, 534, 2722, 253, 391, 15752, 285, 3280, 2105, 16280, 436, 2929, 3400, 4623, 941, 1783, 275, 2426, 273, 3268, 5921, 285, 5028, 9162, 50275, 20, 253, 4477, 2589, 4679, 273, 767, 8892, 26332, 2720, 9454, 5481, 285, 3332, 5028, 26647, 534, 3959, 247, 4217, 22791, 906, 323, 2852, 2561, 50276, 18, 18766, 3410, 1979, 285, 1511, 3738, 253, 10895, 4428, 1740, 1107, 273, 14613, 941, 253, 1180, 273, 4212, 310, 1355, 285, 253, 1180, 273, 3879, 3510, 5728, 310, 671, 1355, 50276, 19, 3480, 273, 5301, 342, 643, 1966, 3879, 15302, 534, 310, 247, 1077, 1774, 789, 323, 10895, 9380, 50276, 20, 891, 11907, 253, 4477, 281, 2085, 247, 5955, 273, 253, 2538, 273, 6332, 5611, 407, 253, 4849, 6500, 5661, 5014, 5474, 339, 431, 248, 2929, 10262, 253, 806, 4471, 2913, 16864, 17950, 15302, 4508, 689, 18450, 4212, 941, 5728, 432, 6109, 285, 8251, 494, 13479, 2366, 342, 247, 4618, 2491, 273, 46128, 17082, 597, 22791, 264, 2067, 11333, 281, 1287, 2087, 12729, 14854, 3879, 14053, 11333, 342, 13654, 327, 9454, 5481, 4836, 50276, 783, 17276, 285, 253, 9454, 5481, 4836, 403, 973, 2684, 50275, 783, 2929, 310, 973, 3542, 285, 253, 3386, 323, 253, 9454, 5481, 4836, 
403, 4722, 50275, 783, 10895, 3133, 12532, 323, 6255, 1786, 2852, 2561, 50275, 5371, 310, 253, 1027, 875, 10895, 275, 818, 285, 436, 10895, 1580, 253, 4477, 403, 253, 1072, 50274, 783, 10895, 2330, 342, 253, 19529, 840, 310, 417, 4751, 1527, 347, 7558, 275, 253, 10199, 50274, 66, 2159, 4114, 273, 253, 9454, 5481, 1895, 285, 253, 2905, 11333, 651, 1056, 253, 2929, 625, 34025, 50274, 2760, 5816, 2281, 323, 253, 941, 778, 2818, 2087, 50228, 273, 253, 4342, 1580, 247, 14916, 1332, 310, 908, 323, 10885, 824, 5816, 941, 50274, 249, 3239, 721, 1907, 247, 8132, 263, 1783, 323, 1016, 807, 285, 7277, 875, 26296, 285, 34807, 395, 11060, 310, 625, 4722, 2429, 281, 816, 247, 2087, 5921, 4868, 50274, 249, 3239, 854, 253, 4477, 1375, 1655, 11333, 2831, 42429, 2087, 50228, 310, 1335, 2080, 432, 20297, 323, 294, 455, 1074, 19007, 247, 625, 7000, 15571, 273, 253, 30328, 273, 824, 5691, 285, 5125, 10746, 403, 5667, 323, 253, 2929, 5747, 1907, 2159, 12494, 275, 5955, 2593, 50274, 30819, 253, 1881, 16223, 4809, 323, 9454, 5593, 285, 417, 1907, 17119, 17497, 10554, 310, 671, 247, 2159, 4202, 273, 253, 2929, 50274, 783, 1563, 6197, 310, 417, 2590, 776, 15302, 476, 5752, 347, 271, 1527, 1071, 3026, 323, 2709, 2831, 42429, 26647, 8892, 281, 7472, 247, 3879, 14053, 11333, 2087, 50228, 285, 31640, 24088, 1072, 390, 1027, 4212, 2439, 2709, 1107, 50274, 783, 1563, 4560, 3133, 14916, 285, 352, 310, 973, 1929, 432, 20162, 6239, 436, 5936, 326, 5014, 2223, 764, 4525, 281, 745, 23485, 316, 5053, 327, 29665, 2007, 5014, 20845, 281, 25057, 29665, 281, 5834, 598, 327, 4600, 347, 2011, 407, 253, 5241, 275, 3026, 7467, 1475, 29665, 50276, 7152, 33032, 253, 4477, 3727, 253, 806, 4471, 2913, 14854, 6109, 17950, 15302, 10985, 689, 18450, 1436, 10526, 2709, 8468, 15302, 26332, 4328, 4481, 10393, 5841, 787, 23548, 3520, 2425, 285, 4600, 3879, 285, 46128, 17082, 24088, 3520, 1786, 6255, 46128, 2675, 46128, 46128, 17082, 403, 5728, 407, 3765, 493, 17276, 285, 2159, 7019, 11151, 314, 24957, 2774, 552, 6803, 17276, 1309, 253, 1263, 616, 22791, 1543, 327, 253, 9454, 5481, 4836, 7568, 326, 1655, 11333, 878, 281, 320, 2007, 3715, 275, 2426, 273, 2087, 50228, 50276, 9328, 12106, 253, 3268, 273, 941, 1078, 285, 846, 9383, 301, 390, 327, 2129, 11015, 285, 29665, 597, 671, 2589, 5921, 1783, 26475, 253, 2954, 875, 5312, 13576, 285, 46128, 17082, 253, 767, 5028, 9162, 8892, 597, 2589, 921, 1534, 3268, 15036, 2190, 15302, 285, 4292, 50276, 9328, 2085, 22791, 1543, 273, 898, 2720, 9454, 5481, 11333, 285, 898, 3676, 28269, 5028, 26647, 11333, 327, 253, 9454, 5481, 4836, 253, 5661, 873, 8777, 403, 4272, 715, 247, 2014, 10895, 285, 247, 2831, 10895, 50276, 783, 1682, 9454, 5481, 11333, 403, 1805, 387, 253, 1625, 1070, 255, 23456, 4836, 1223, 253, 1682, 5028, 26647, 11333, 403, 1805, 387, 2831, 42429, 8892, 247, 1534, 3045, 5926, 432, 253, 2014, 10895, 4836, 281, 253, 1264, 2831, 46906, 1507, 2722, 326, 1655, 11333, 403, 1335, 2080, 432, 294, 455, 1074, 19007, 50276, 2520, 1263, 16633, 327, 540, 248, 32778, 2069, 12395, 941, 534, 556, 417, 644, 7561, 14859, 275, 2426, 273, 5028, 26647, 2429, 281, 4382, 8113, 390, 3626, 3448, 5162, 10625, 50276, 9328, 4822, 4209, 3386, 4328, 4481, 10393, 787, 23548, 1067, 3520, 2425, 285, 4600, 5014, 6312, 13772, 2159, 17276, 285, 9470, 17276, 1078, 253, 1265, 285, 387, 253, 990, 273, 253, 1263, 3765, 493, 17276, 3835, 13216, 270, 11125, 740, 3520, 1786, 16666, 6255, 46128, 270, 69, 12211, 285, 2675, 973, 1257, 723, 1215, 273, 2675, 285, 11073, 4944, 4311, 35779, 253, 2159, 17276, 2486, 815, 82, 21, 268, 859, 21, 285, 
3199, 284, 50276, 2520, 10949, 941, 432, 1740, 12640, 1107, 534, 310, 9470, 2217, 281, 320, 5608, 347, 14854, 2561, 253, 1740, 1107, 2882, 273, 767, 1107, 1078, 9383, 301, 285, 767, 1107, 846, 9383, 301, 436, 5644, 281, 247, 5028, 5333, 875, 15302, 285, 352, 2789, 253, 4439, 941, 625, 27096, 285, 7470, 323, 12262, 5028, 26647, 50276, 783, 22791, 4428, 1097, 9454, 5481, 11333, 285, 5028, 26647, 11333, 50276, 262, 651, 452, 644, 1805, 604, 597, 2085, 253, 9990, 273, 616, 941, 4518, 285, 7277, 352, 281, 253, 1643, 643, 15302, 534, 476, 320, 2429, 281, 13371, 358, 10895, 671, 347, 597, 5393, 6109, 17950, 941, 5257, 281, 452, 247, 1029, 941, 5816, 2281, 533, 597, 858, 417, 2085, 253, 2120, 5816, 2281, 273, 1016, 4735, 352, 310, 4619, 323, 2069, 12395, 941, 50276, 2520, 310, 12262, 1966, 3879, 14053, 533, 253, 22791, 10949, 247, 9454, 5481, 4836, 3815, 342, 581, 5203, 352, 651, 320, 1805, 281, 387, 1878, 1804, 690, 643, 8892, 2905, 281, 1966, 3879, 14053, 342, 436, 9470, 10895, 50276, 262, 310, 3710, 281, 247, 2176, 3072, 824, 347, 3484, 347, 757, 285, 3168, 285, 16618, 285, 806, 14520, 5014, 50276, 783, 1543, 273, 253, 2831, 10895, 9978, 1957, 253, 5028, 2087, 50228, 5028, 9162, 8892, 403, 2530, 281, 921, 253, 5028, 5333, 273, 616, 10895, 253, 9557, 292, 742, 255, 23456, 3368, 12453, 253, 5028, 5333, 275, 253, 3553, 11406, 255, 23456, 483, 9978, 285, 253, 7377, 261, 384, 248, 10816, 3368, 11424, 253, 5028, 5333, 275, 253, 21481, 4212, 2439, 15302, 2593, 7609, 13730, 11424, 253, 5028, 5333, 275, 253, 3765, 493, 31485, 301, 9978, 533, 352, 651, 320, 1805, 281, 823, 2074, 3368, 5474, 339, 431, 248, 10895, 37035, 2067, 3236, 3386, 50276, 23939, 2913, 3021, 5277, 247, 15958, 4836, 323, 26647, 50276, 5969, 15424, 941, 432, 13479, 1485, 673, 2962, 50276, 1661, 3146, 11117, 3072, 2568, 512, 273, 731, 403, 3484, 50276, 783, 2929, 671, 3400, 50276, 16680, 323, 1283, 1896, 3210, 50276, 11984, 1491, 327, 10895, 4849, 5955, 273, 11068, 7350, 50276, 2003, 273, 253, 10895, 556, 2168, 644, 908, 281, 15452, 24088, 277, 590, 357, 1162, 355, 50276, 35065, 49602, 275, 9454, 10554, 897, 41174, 729, 2842, 941, 14333, 4204, 280, 680, 91, 2505, 294, 18393, 1162, 355, 6630, 941, 32219, 1162, 355, 390, 20896, 1200, 21401, 1162, 355, 50276, 262, 3133, 3021, 326, 436, 10895, 509, 9115, 5368, 49602, 253, 1305, 373, 29983, 10895, 556, 271, 2425, 4735, 285, 310, 253, 954, 2074, 533, 19756, 6793, 3386, 50276, 783, 10895, 4849, 310, 973, 14290, 10963, 3496, 67, 4366, 4948, 5939, 281, 1263, 5014, 3966, 50276, 5969, 4849, 273, 1543, 1690, 3300, 25884, 780, 20980, 1497, 17245, 3082, 285, 13361, 3082, 50276, 783, 4471, 2913, 3753, 285, 3765, 493, 31485, 301, 1646, 281, 320, 1077, 4217, 281, 22791, 258, 351, 3082, 50276, 2420, 337, 19756, 253, 4588, 7982, 1146, 908, 391, 7645, 3045, 30063, 7200, 3164, 50276, 9154, 2167, 352, 310, 2834, 285, 19983, 253, 10895, 1979, 812, 320, 2559, 347, 18450, 10872, 310, 247, 2372, 1698, 323, 22791, 272, 13361, 3210, 342, 247, 2495, 281, 689, 8491, 4541, 253, 1071, 873, 50276, 783, 1881, 1968, 272, 310, 4274, 17980, 50276, 2858, 436, 556, 644, 9113, 4767, 407, 253, 4477, 50276, 262, 310, 417, 2590, 281, 479, 849, 4217, 310, 253, 10895, 347, 352, 3133, 326, 2190, 1283, 3082, 417, 581, 310, 2104, 281, 1347, 24088, 275, 8909, 5028, 3026, 556, 2067, 15302, 342, 2456, 7200, 50276, 16534, 352, 320, 326, 3386, 403, 417, 326, 15970, 50275, 7265, 14855, 337, 352, 310, 417, 4518, 4767, 752, 1566, 5438, 1332, 369, 908, 281, 5206, 4373, 22041, 273, 253, 1027, 3082, 50275, 249, 258, 351, 49725, 43792, 6451, 
1162, 355, 452, 2540, 326, 3553, 251, 5881, 404, 562, 369, 417, 253, 1682, 1566, 5438, 2746, 594, 352, 812, 320, 4722, 281, 2085, 1543, 970, 3733, 5028, 12820, 3553, 581, 5028, 562, 285, 42295, 5438, 923, 253, 5028, 3026, 2929, 50275, 7265, 14855, 374, 253, 18557, 22791, 5987, 48853, 31591, 17144, 7280, 900, 310, 7561, 908, 275, 8909, 285, 10262, 347, 973, 8468, 673, 2962, 941, 1014, 2167, 253, 5203, 310, 1027, 253, 3064, 943, 320, 5469, 275, 253, 2929, 2490, 187, 4118, 18435, 27, 28626, 358, 310, 247, 4471, 2913, 1966, 8770, 17950, 10895, 10848, 432, 6109, 285, 8251, 494, 4095, 436, 310, 247, 4122, 3486, 1020, 10895, 326, 588, 5649, 2067, 2561, 10625, 1690, 5145, 4715, 288, 5297, 285, 33079, 12672, 40215, 12672, 285, 6255, 1786, 2561, 253, 30628, 8127, 5194, 327, 253, 2266, 9021, 1097, 327, 253, 10895, 347, 973, 347, 253, 22791, 4836, 327, 5028, 2087, 5837, 5802, 432, 253, 8770, 1078, 285, 846, 9383, 301, 746, 50276, 20261, 2067, 5609, 326, 403, 8671, 824, 347, 253, 516, 10340, 5609, 403, 327, 253, 27785, 1930, 436, 476, 1527, 9091, 323, 2571, 281, 789, 327, 253, 10895, 285, 3157, 253, 26278, 7794, 3103, 891, 7052, 5583, 14924, 273, 253, 789 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 35615, 326, 497, 4158, 281, 39970, 973, 390, 310, 627, 247, 625, 7936, 3064, 875, 731, 285, 253, 9454, 5481, 11333, 50275, 783, 12002, 25957, 689, 18450, 4212, 534, 778, 320, 24363, 1580, 627, 403, 44473, 4451, 4212, 50276, 1171, 2282, 1097, 3904, 403, 1077, 13943, 50275, 27390, 891, 3261, 387, 690, 941, 4367, 352, 369, 2649, 4745, 2590, 326, 253, 15302, 10491, 2281, 369, 8069, 2378, 591, 1388, 326, 6010, 941, 310, 2530, 323, 1016, 5542, 323, 1016, 1388, 50276, 74, 871, 352, 369, 5393, 327, 1386, 23512, 533, 352, 3133, 16613, 2217, 281, 6780, 352, 50276, 303, 417, 3240, 2119, 835, 253, 40006, 3386, 432, 253, 643, 673, 13794, 403, 3559, 533, 4931, 597, 403, 275, 1027, 941, 4367, 50275, 8826, 625, 1491, 670, 17845, 2352, 778, 320, 9371, 281, 2486, 275, 253, 2022, 2929, 50275, 262, 778, 320, 9371, 281, 3748, 253, 4944, 2713, 3210, 908, 50276, 10192, 40224, 597, 42126, 2486, 2798, 2281, 390, 2608, 4478, 4215, 5763, 13301, 5474, 339, 431, 248, 4477, 2770, 327, 14053, 14854, 3879, 6298, 285, 10018, 326, 627, 403, 642, 13644, 2130, 15302, 342, 2709, 10625, 594, 253, 4477, 12661, 253, 806, 4471, 2913, 6109, 285, 8251, 494, 17950, 15302, 347, 271, 1527, 1071, 3026, 253, 4477, 294, 303, 3018, 1283, 3082, 285, 2085, 253, 22791, 1543, 9470, 1543, 10313, 253, 7364, 273, 253, 1655, 3082, 24088, 3480, 273, 2087, 50228, 253, 10895, 2112, 342, 253, 22791, 1543, 588, 7170, 253, 2440, 273, 2087, 12729, 14854, 3879, 14053, 337, 253, 806, 4471, 2913, 3879, 10895, 326, 8525, 5028, 26647, 1263, 374, 11080, 285, 11088, 941, 4849, 21678, 5740, 495, 271, 562, 23037, 248, 3364, 22791, 272, 5147, 342, 1283, 8527, 11333, 619, 2022, 4468, 310, 253, 1896, 21481, 2600, 342, 253, 4477, 762, 15337, 19529, 818, 352, 3133, 841, 767, 9380, 897, 253, 1072, 5147, 26332, 13371, 358, 253, 4477, 671, 1750, 326, 1142, 17438, 403, 2668, 432, 818, 342, 23633, 5474, 33032, 2520, 2929, 3400, 253, 806, 14854, 6109, 285, 8251, 494, 17950, 10895, 285, 5012, 253, 22791, 1543, 323, 1283, 2905, 11333, 323, 253, 9454, 5481, 16280, 275, 436, 2929, 253, 941, 4849, 1232, 285, 27130, 5127, 885, 72, 941, 3268, 285, 5921, 50276, 609, 1246, 534, 1361, 8607, 281, 2096, 436, 10895, 4541, 50276, 18, 253, 2530, 10895, 310, 253, 806, 4471, 2913, 15302, 432, 6109, 285, 8251, 494, 17950, 689, 18450, 1436, 10526, 342, 21481, 4212, 4722, 253, 10895, 3797, 3765, 493, 9383, 301, 14613, 941, 50275, 19, 253, 941, 4849, 1232, 310, 2590, 534, 2722, 253, 391, 15752, 285, 3280, 2105, 16280, 436, 2929, 3400, 4623, 941, 1783, 275, 2426, 273, 3268, 5921, 285, 5028, 9162, 50275, 20, 253, 4477, 2589, 4679, 273, 767, 8892, 26332, 2720, 9454, 5481, 285, 3332, 5028, 26647, 534, 3959, 247, 4217, 22791, 906, 323, 2852, 2561, 50276, 18, 18766, 3410, 1979, 285, 1511, 3738, 253, 10895, 4428, 1740, 1107, 273, 14613, 941, 253, 1180, 273, 4212, 310, 1355, 285, 253, 1180, 273, 3879, 3510, 5728, 310, 671, 1355, 50276, 19, 3480, 273, 5301, 342, 643, 1966, 3879, 15302, 534, 310, 247, 1077, 1774, 789, 323, 10895, 9380, 50276, 20, 891, 11907, 253, 4477, 281, 2085, 247, 5955, 273, 253, 2538, 273, 6332, 5611, 407, 253, 4849, 6500, 5661, 5014, 5474, 339, 431, 248, 2929, 10262, 253, 806, 4471, 2913, 16864, 17950, 15302, 4508, 689, 18450, 4212, 941, 5728, 432, 6109, 285, 8251, 494, 13479, 2366, 342, 247, 4618, 2491, 273, 46128, 17082, 597, 22791, 264, 2067, 11333, 281, 1287, 2087, 12729, 14854, 3879, 14053, 11333, 342, 13654, 327, 9454, 5481, 4836, 50276, 783, 17276, 285, 253, 9454, 5481, 4836, 403, 973, 2684, 50275, 783, 2929, 310, 973, 3542, 285, 253, 3386, 323, 253, 9454, 5481, 4836, 
403, 4722, 50275, 783, 10895, 3133, 12532, 323, 6255, 1786, 2852, 2561, 50275, 5371, 310, 253, 1027, 875, 10895, 275, 818, 285, 436, 10895, 1580, 253, 4477, 403, 253, 1072, 50274, 783, 10895, 2330, 342, 253, 19529, 840, 310, 417, 4751, 1527, 347, 7558, 275, 253, 10199, 50274, 66, 2159, 4114, 273, 253, 9454, 5481, 1895, 285, 253, 2905, 11333, 651, 1056, 253, 2929, 625, 34025, 50274, 2760, 5816, 2281, 323, 253, 941, 778, 2818, 2087, 50228, 273, 253, 4342, 1580, 247, 14916, 1332, 310, 908, 323, 10885, 824, 5816, 941, 50274, 249, 3239, 721, 1907, 247, 8132, 263, 1783, 323, 1016, 807, 285, 7277, 875, 26296, 285, 34807, 395, 11060, 310, 625, 4722, 2429, 281, 816, 247, 2087, 5921, 4868, 50274, 249, 3239, 854, 253, 4477, 1375, 1655, 11333, 2831, 42429, 2087, 50228, 310, 1335, 2080, 432, 20297, 323, 294, 455, 1074, 19007, 247, 625, 7000, 15571, 273, 253, 30328, 273, 824, 5691, 285, 5125, 10746, 403, 5667, 323, 253, 2929, 5747, 1907, 2159, 12494, 275, 5955, 2593, 50274, 30819, 253, 1881, 16223, 4809, 323, 9454, 5593, 285, 417, 1907, 17119, 17497, 10554, 310, 671, 247, 2159, 4202, 273, 253, 2929, 50274, 783, 1563, 6197, 310, 417, 2590, 776, 15302, 476, 5752, 347, 271, 1527, 1071, 3026, 323, 2709, 2831, 42429, 26647, 8892, 281, 7472, 247, 3879, 14053, 11333, 2087, 50228, 285, 31640, 24088, 1072, 390, 1027, 4212, 2439, 2709, 1107, 50274, 783, 1563, 4560, 3133, 14916, 285, 352, 310, 973, 1929, 432, 20162, 6239, 436, 5936, 326, 5014, 2223, 764, 4525, 281, 745, 23485, 316, 5053, 327, 29665, 2007, 5014, 20845, 281, 25057, 29665, 281, 5834, 598, 327, 4600, 347, 2011, 407, 253, 5241, 275, 3026, 7467, 1475, 29665, 50276, 7152, 33032, 253, 4477, 3727, 253, 806, 4471, 2913, 14854, 6109, 17950, 15302, 10985, 689, 18450, 1436, 10526, 2709, 8468, 15302, 26332, 4328, 4481, 10393, 5841, 787, 23548, 3520, 2425, 285, 4600, 3879, 285, 46128, 17082, 24088, 3520, 1786, 6255, 46128, 2675, 46128, 46128, 17082, 403, 5728, 407, 3765, 493, 17276, 285, 2159, 7019, 11151, 314, 24957, 2774, 552, 6803, 17276, 1309, 253, 1263, 616, 22791, 1543, 327, 253, 9454, 5481, 4836, 7568, 326, 1655, 11333, 878, 281, 320, 2007, 3715, 275, 2426, 273, 2087, 50228, 50276, 9328, 12106, 253, 3268, 273, 941, 1078, 285, 846, 9383, 301, 390, 327, 2129, 11015, 285, 29665, 597, 671, 2589, 5921, 1783, 26475, 253, 2954, 875, 5312, 13576, 285, 46128, 17082, 253, 767, 5028, 9162, 8892, 597, 2589, 921, 1534, 3268, 15036, 2190, 15302, 285, 4292, 50276, 9328, 2085, 22791, 1543, 273, 898, 2720, 9454, 5481, 11333, 285, 898, 3676, 28269, 5028, 26647, 11333, 327, 253, 9454, 5481, 4836, 253, 5661, 873, 8777, 403, 4272, 715, 247, 2014, 10895, 285, 247, 2831, 10895, 50276, 783, 1682, 9454, 5481, 11333, 403, 1805, 387, 253, 1625, 1070, 255, 23456, 4836, 1223, 253, 1682, 5028, 26647, 11333, 403, 1805, 387, 2831, 42429, 8892, 247, 1534, 3045, 5926, 432, 253, 2014, 10895, 4836, 281, 253, 1264, 2831, 46906, 1507, 2722, 326, 1655, 11333, 403, 1335, 2080, 432, 294, 455, 1074, 19007, 50276, 2520, 1263, 16633, 327, 540, 248, 32778, 2069, 12395, 941, 534, 556, 417, 644, 7561, 14859, 275, 2426, 273, 5028, 26647, 2429, 281, 4382, 8113, 390, 3626, 3448, 5162, 10625, 50276, 9328, 4822, 4209, 3386, 4328, 4481, 10393, 787, 23548, 1067, 3520, 2425, 285, 4600, 5014, 6312, 13772, 2159, 17276, 285, 9470, 17276, 1078, 253, 1265, 285, 387, 253, 990, 273, 253, 1263, 3765, 493, 17276, 3835, 13216, 270, 11125, 740, 3520, 1786, 16666, 6255, 46128, 270, 69, 12211, 285, 2675, 973, 1257, 723, 1215, 273, 2675, 285, 11073, 4944, 4311, 35779, 253, 2159, 17276, 2486, 815, 82, 21, 268, 859, 21, 285, 
3199, 284, 50276, 2520, 10949, 941, 432, 1740, 12640, 1107, 534, 310, 9470, 2217, 281, 320, 5608, 347, 14854, 2561, 253, 1740, 1107, 2882, 273, 767, 1107, 1078, 9383, 301, 285, 767, 1107, 846, 9383, 301, 436, 5644, 281, 247, 5028, 5333, 875, 15302, 285, 352, 2789, 253, 4439, 941, 625, 27096, 285, 7470, 323, 12262, 5028, 26647, 50276, 783, 22791, 4428, 1097, 9454, 5481, 11333, 285, 5028, 26647, 11333, 50276, 262, 651, 452, 644, 1805, 604, 597, 2085, 253, 9990, 273, 616, 941, 4518, 285, 7277, 352, 281, 253, 1643, 643, 15302, 534, 476, 320, 2429, 281, 13371, 358, 10895, 671, 347, 597, 5393, 6109, 17950, 941, 5257, 281, 452, 247, 1029, 941, 5816, 2281, 533, 597, 858, 417, 2085, 253, 2120, 5816, 2281, 273, 1016, 4735, 352, 310, 4619, 323, 2069, 12395, 941, 50276, 2520, 310, 12262, 1966, 3879, 14053, 533, 253, 22791, 10949, 247, 9454, 5481, 4836, 3815, 342, 581, 5203, 352, 651, 320, 1805, 281, 387, 1878, 1804, 690, 643, 8892, 2905, 281, 1966, 3879, 14053, 342, 436, 9470, 10895, 50276, 262, 310, 3710, 281, 247, 2176, 3072, 824, 347, 3484, 347, 757, 285, 3168, 285, 16618, 285, 806, 14520, 5014, 50276, 783, 1543, 273, 253, 2831, 10895, 9978, 1957, 253, 5028, 2087, 50228, 5028, 9162, 8892, 403, 2530, 281, 921, 253, 5028, 5333, 273, 616, 10895, 253, 9557, 292, 742, 255, 23456, 3368, 12453, 253, 5028, 5333, 275, 253, 3553, 11406, 255, 23456, 483, 9978, 285, 253, 7377, 261, 384, 248, 10816, 3368, 11424, 253, 5028, 5333, 275, 253, 21481, 4212, 2439, 15302, 2593, 7609, 13730, 11424, 253, 5028, 5333, 275, 253, 3765, 493, 31485, 301, 9978, 533, 352, 651, 320, 1805, 281, 823, 2074, 3368, 5474, 339, 431, 248, 10895, 37035, 2067, 3236, 3386, 50276, 23939, 2913, 3021, 5277, 247, 15958, 4836, 323, 26647, 50276, 5969, 15424, 941, 432, 13479, 1485, 673, 2962, 50276, 1661, 3146, 11117, 3072, 2568, 512, 273, 731, 403, 3484, 50276, 783, 2929, 671, 3400, 50276, 16680, 323, 1283, 1896, 3210, 50276, 11984, 1491, 327, 10895, 4849, 5955, 273, 11068, 7350, 50276, 2003, 273, 253, 10895, 556, 2168, 644, 908, 281, 15452, 24088, 277, 590, 357, 1162, 355, 50276, 35065, 49602, 275, 9454, 10554, 897, 41174, 729, 2842, 941, 14333, 4204, 280, 680, 91, 2505, 294, 18393, 1162, 355, 6630, 941, 32219, 1162, 355, 390, 20896, 1200, 21401, 1162, 355, 50276, 262, 3133, 3021, 326, 436, 10895, 509, 9115, 5368, 49602, 253, 1305, 373, 29983, 10895, 556, 271, 2425, 4735, 285, 310, 253, 954, 2074, 533, 19756, 6793, 3386, 50276, 783, 10895, 4849, 310, 973, 14290, 10963, 3496, 67, 4366, 4948, 5939, 281, 1263, 5014, 3966, 50276, 5969, 4849, 273, 1543, 1690, 3300, 25884, 780, 20980, 1497, 17245, 3082, 285, 13361, 3082, 50276, 783, 4471, 2913, 3753, 285, 3765, 493, 31485, 301, 1646, 281, 320, 1077, 4217, 281, 22791, 258, 351, 3082, 50276, 2420, 337, 19756, 253, 4588, 7982, 1146, 908, 391, 7645, 3045, 30063, 7200, 3164, 50276, 9154, 2167, 352, 310, 2834, 285, 19983, 253, 10895, 1979, 812, 320, 2559, 347, 18450, 10872, 310, 247, 2372, 1698, 323, 22791, 272, 13361, 3210, 342, 247, 2495, 281, 689, 8491, 4541, 253, 1071, 873, 50276, 783, 1881, 1968, 272, 310, 4274, 17980, 50276, 2858, 436, 556, 644, 9113, 4767, 407, 253, 4477, 50276, 262, 310, 417, 2590, 281, 479, 849, 4217, 310, 253, 10895, 347, 352, 3133, 326, 2190, 1283, 3082, 417, 581, 310, 2104, 281, 1347, 24088, 275, 8909, 5028, 3026, 556, 2067, 15302, 342, 2456, 7200, 50276, 16534, 352, 320, 326, 3386, 403, 417, 326, 15970, 50275, 7265, 14855, 337, 352, 310, 417, 4518, 4767, 752, 1566, 5438, 1332, 369, 908, 281, 5206, 4373, 22041, 273, 253, 1027, 3082, 50275, 249, 258, 351, 49725, 43792, 6451, 
1162, 355, 452, 2540, 326, 3553, 251, 5881, 404, 562, 369, 417, 253, 1682, 1566, 5438, 2746, 594, 352, 812, 320, 4722, 281, 2085, 1543, 970, 3733, 5028, 12820, 3553, 581, 5028, 562, 285, 42295, 5438, 923, 253, 5028, 3026, 2929, 50275, 7265, 14855, 374, 253, 18557, 22791, 5987, 48853, 31591, 17144, 7280, 900, 310, 7561, 908, 275, 8909, 285, 10262, 347, 973, 8468, 673, 2962, 941, 1014, 2167, 253, 5203, 310, 1027, 253, 3064, 943, 320, 5469, 275, 253, 2929, 2490, 187, 4118, 18435, 27, 28626, 358, 310, 247, 4471, 2913, 1966, 8770, 17950, 10895, 10848, 432, 6109, 285, 8251, 494, 4095, 436, 310, 247, 4122, 3486, 1020, 10895, 326, 588, 5649, 2067, 2561, 10625, 1690, 5145, 4715, 288, 5297, 285, 33079, 12672, 40215, 12672, 285, 6255, 1786, 2561, 253, 30628, 8127, 5194, 327, 253, 2266, 9021, 1097, 327, 253, 10895, 347, 973, 347, 253, 22791, 4836, 327, 5028, 2087, 5837, 5802, 432, 253, 8770, 1078, 285, 846, 9383, 301, 746, 50276, 20261, 2067, 5609, 326, 403, 8671, 824, 347, 253, 516, 10340, 5609, 403, 327, 253, 27785, 1930, 436, 476, 1527, 9091, 323, 2571, 281, 789, 327, 253, 10895, 285, 3157, 253, 26278, 7794, 3103, 891, 7052, 5583, 14924, 273, 253, 789 ]
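For readers trying to line the numeric arrays above up with the text record they follow, this is a small, hypothetical sketch of how such triplets (token ids, an all-ones attention mask, and labels that mirror the ids) are typically produced for causal language-model fine-tuning on review-summarization pairs. The tokenizer name and maximum length below are assumptions; the dump does not state which tokenizer generated these arrays.

```python
# Hypothetical construction of one record: tokenize the review text plus its
# summary, keep the attention mask, and copy the ids into the labels field.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder tokenizer, not the real one

def build_record(review_text: str, summary_text: str, max_len: int = 2048):
    full_text = review_text + " " + summary_text
    enc = tokenizer(full_text, truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all 1s when nothing is padded
        "labels": list(enc["input_ids"]),         # labels simply repeat the ids, as above
    }
```

Under this reading, the all-ones attention_mask rows indicate unpadded sequences, and the labels array repeating the input_ids is the usual setup when the model is trained to reproduce the full record.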
Below is given a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: the authors propose the thinned generalized gamma process tggp for generating sparse graphs with mixedmembership block structure it builds upon the ggpbased model of caron and fox 12 and recent extensions to incorporate block structure the tggp thins potential edges from a ggp by removing edges where the community assignments for two nodes do not match the authors claim that their tggp is superior to the compound ggp approach of todeschini et al 17 and empirical results support this claim strengths empirical results suggest that the proposed tggp provides a superior approach for modeling sparsity with mixedmembership block structure compared to the cggp which is the only other approach that can simultaneously model both properties the paper is mostly well written and the explanations are quite clear i enjoyed reading it however it seems to have been written in a rush with lots of sloppy presentation issues see weaknesses that detract from overall presentation quality proposed model and posterior inference algorithm appear to be technically sound and require some clever ways of sampling to avoid blowing up the computational complexity weaknesses lots of presentation issues this paper badly needs a copy edit and just a bit more time to correct the errors i have provided a list below these errors detract from what was a well written paper experiments are on somewhat small data sets with a maximum of about 2000 nodes and about 33000 edges also no experiments were performed on real data with labeled communities so estimated communities are only compared based on simulated data novelty is only moderate as the proposed model addresses the same setting as the cggp from todeschini et al 17 and most of the results follow from the ggpbased construction of caron and fox 12 presentation issues unfinished last paragraph of intro beginning on line 39 in this paper we propose a novel random graph model that the variable x is first used in equation 5 in section 22 but not defined until its later usage in section 3 lines 150151 total number of proposed nodes nij between any two edges i and j in the multigraph should read total number of proposed edges nij between any two nodes i and j in the multigraph line 184 estiamte should be estimate figure 5 is too short i cannot see how frequently the true simulated sociability values fall within the 95 credible intervals for most of the nodes which have wi 05 figure 6 should have axis labels rather than needing to read the caption also i find figure 4 in the supplementary material which shows the same results as figure 6 in the main paper but with all 5 models being compared to be more convincing i suggest that the authors replace figure 6 in the main paper with figure 4 in the supplementary checklist was not completed entries 4 and 5 still listed as todo limitations are not really discussed just possible extensions for further work the authors should discuss the scalability by providing a rough scale of how large of networks they can fit in a reasonable amount of computation time eg a few days docsepthe paper proposes tggp a new model for sparse random graphs with planted overlapping communities based on the generalized gamma process ggp and shows some experiments comparing the behaviour of tggp compared to the previously proposed cggp compared to cggp the proposed tggp does not require fixing the number of communities a priori and has superior performance in terms of community
detection and prediction of missing edges on synthetic and realworld graphs strengths the idea of allowing for overlapping communities via thinning in the ggp model is interesting and new tggp performs better as compared to cggp weaknesses the paper seems more like a draft lots of work is needed to improve the presentation for example more explanation of backgrounds and related work better structured contents and a clearer presentation of the generative algorithm based on mcmc the performance in terms of community extraction graph reconstruction is superior with respect to cggp but it does not seem competitive with other available techniques for the specific tasks so the authors should clarify what the main contributions exactly are in particular to what extent they fit the machine learning community the work should come with an accompanying code that can be used to reproduce the experiments shown in the manuscript and to allow the use of the random graph generator by future researchers the paper does not discuss potential limitations docsepthis paper proposed a new framework for analyzing the community structure in the network they modeled the overlapping community structures in the network by extending the generalized gamma process ggp model the model first generated a latent multigraph where edges are assigned depending on the sociability of nodes and their community memberships then the edges in the multigraph that connect two different communities are thinned ie unobservable the observed network is formulated as the projection of the latent multigraph excluding the thinned edges they also provided an efficient way to implement the posterior inference by utilizing the sparse property of the observed network specifically they sampled the community concordant and community discordant edges separately based on the observed network their experiments also showed that this new model recovered the community membership in the network better than the previous model this new framework proposed in this paper is interesting this new model can infer the overlapping community structures in the network which has many applications and attracts a lot of attention in the research this new model captures the sparsity and mixed community membership in realworld networks this new model extended the generalized gamma process ggp model by introducing the mixed community membership for each node the technique used in this extension is in some sense standard by utilizing the sparsity in the observed network yes they mentioned some potential generalizations of their model by including additional available information ### Summary:
the authors present a framework for thinning edges from random graph realizations from the generalized gamma process ggp to generate sparse graphs with mixed community memberships the authors provide efficient monte carlo methods that scale subquadratically with the number of nodes there are concerns about scalability of the proposed method and its novelty over the ggp based construction of caron and fox the reviewers also note that the paper needs proofreading and a clearer exposition
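As a rough illustration of the thinning mechanism the reviews describe, the following hypothetical sketch draws sociability weights and overlapping community memberships, builds a latent Poisson multigraph, and keeps only the community-concordant edges. A finite gamma draw stands in for the generalized gamma process, and the naive double loop here is quadratic, unlike the subquadratic samplers credited to the paper; none of this reproduces the authors' actual model or posterior inference.

```python
# Simplified stand-in for the thinning idea: latent multigraph counts are
# Poisson in the product of sociability weights, and edges whose endpoints
# share no community are thinned (never observed).
import numpy as np

rng = np.random.default_rng(0)

def sample_thinned_graph(n_nodes=200, n_comms=4, p_member=0.3):
    w = rng.gamma(shape=0.5, scale=1.0, size=n_nodes)   # sociability weights (assumed prior)
    z = rng.random((n_nodes, n_comms)) < p_member        # overlapping community memberships
    edges = set()
    for i in range(n_nodes):
        for j in range(i + 1, n_nodes):
            n_ij = rng.poisson(w[i] * w[j])               # latent multigraph edge count
            concordant = np.any(z[i] & z[j])              # do the nodes share a community?
            if n_ij > 0 and concordant:                   # discordant edges are thinned away
                edges.add((i, j))
    return w, z, edges
```

The observed graph is then the set of surviving edges, which is why, as the reviews note, community-discordant connections exist only in the latent multigraph and never appear in the data.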
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 12661, 253, 289, 13172, 14923, 17356, 1232, 246, 1266, 81, 323, 11365, 23507, 14580, 342, 6804, 25011, 1456, 2972, 2605, 352, 21168, 2220, 253, 305, 17788, 3169, 1566, 273, 1113, 251, 285, 30013, 1249, 285, 3332, 18149, 281, 19071, 2972, 2605, 253, 246, 1266, 81, 289, 968, 2442, 9297, 432, 247, 305, 17788, 407, 11922, 9297, 835, 253, 3114, 23768, 323, 767, 7632, 513, 417, 3761, 253, 4477, 1750, 326, 616, 246, 1266, 81, 310, 8936, 281, 253, 8508, 305, 17788, 2746, 273, 281, 3229, 46635, 1162, 355, 1722, 285, 16774, 1543, 1329, 436, 1750, 20544, 50276, 358, 5378, 474, 1543, 1804, 326, 4081, 246, 1266, 81, 3400, 247, 8936, 2746, 323, 14053, 37139, 414, 342, 6804, 25011, 1456, 2972, 2605, 2429, 281, 253, 260, 1266, 81, 534, 310, 253, 760, 643, 2746, 326, 476, 10486, 1566, 1097, 3607, 50276, 783, 2929, 310, 6571, 973, 3542, 285, 253, 22909, 403, 3240, 2590, 891, 11346, 4361, 352, 2299, 352, 3133, 281, 452, 644, 3542, 275, 247, 16949, 342, 8783, 273, 1499, 45695, 9759, 3374, 923, 32213, 326, 843, 974, 432, 4583, 9759, 3290, 50276, 856, 7334, 1566, 285, 12637, 17032, 5933, 3176, 281, 320, 22335, 3590, 285, 2430, 690, 19080, 4088, 273, 10491, 281, 3693, 24935, 598, 253, 15180, 10454, 50276, 20881, 1255, 265, 50276, 77, 1502, 273, 9759, 1521, 34401, 8701, 2929, 16426, 3198, 247, 3491, 12921, 285, 816, 247, 2372, 625, 673, 281, 3451, 253, 6332, 891, 452, 2530, 247, 1618, 2708, 841, 6332, 843, 974, 432, 752, 369, 247, 973, 3542, 2929, 50276, 16217, 3825, 403, 327, 8489, 1355, 941, 5239, 342, 247, 4869, 273, 670, 5307, 7632, 285, 670, 5922, 933, 9297, 671, 642, 4679, 497, 2684, 327, 1524, 941, 342, 13130, 7888, 594, 5998, 7888, 403, 760, 2429, 1754, 327, 15524, 941, 50276, 2369, 652, 555, 310, 760, 10290, 347, 253, 4081, 1566, 12453, 253, 1072, 4758, 347, 253, 260, 1266, 81, 432, 281, 3229, 46635, 1162, 355, 1722, 285, 954, 273, 253, 1543, 956, 432, 253, 305, 377, 3169, 5140, 273, 1113, 251, 285, 30013, 1249, 50276, 49836, 3374, 50276, 328, 36077, 1390, 12494, 273, 26432, 5068, 327, 1386, 6931, 275, 436, 2929, 359, 12661, 247, 4460, 3632, 4216, 1566, 326, 50276, 783, 4778, 1269, 310, 806, 908, 275, 5150, 608, 275, 2593, 3307, 533, 417, 2931, 1919, 697, 1996, 10393, 275, 2593, 495, 50276, 8737, 1458, 520, 3712, 2264, 1180, 273, 4081, 7632, 295, 1944, 875, 667, 767, 9297, 891, 285, 480, 275, 253, 1554, 32608, 50276, 13074, 1180, 273, 4081, 9297, 295, 1944, 875, 667, 767, 7632, 891, 285, 480, 275, 253, 1554, 32608, 50276, 1282, 25921, 1144, 16726, 442, 50276, 383, 2542, 50276, 13206, 608, 310, 1512, 2159, 74, 2550, 923, 849, 7208, 253, 2032, 15524, 10621, 1430, 2193, 2965, 1561, 253, 5325, 24542, 11508, 323, 954, 273, 253, 7632, 534, 452, 38435, 50276, 1762, 50276, 13206, 721, 943, 452, 7844, 13301, 2581, 685, 25312, 281, 1239, 253, 11743, 671, 891, 1089, 4677, 577, 275, 253, 24864, 2144, 534, 2722, 253, 1072, 1543, 347, 4677, 721, 275, 253, 2022, 2929, 533, 342, 512, 608, 3210, 1146, 2429, 281, 320, 625, 21414, 891, 1804, 323, 253, 4477, 281, 8171, 4677, 721, 275, 253, 2022, 2929, 342, 4677, 577, 275, 253, 24864, 50276, 5903, 3550, 369, 417, 6312, 12028, 577, 285, 608, 1335, 7117, 347, 20591, 7364, 403, 417, 1663, 5469, 816, 1896, 18149, 323, 2007, 789, 253, 4477, 943, 2319, 253, 9171, 1430, 407, 5277, 247, 7227, 4311, 273, 849, 1781, 273, 6928, 597, 476, 4944, 275, 247, 5272, 2408, 273, 13782, 673, 24088, 247, 1643, 1897, 5474, 339, 431, 248, 2929, 
29328, 246, 1266, 81, 247, 747, 1566, 323, 23507, 3632, 14580, 342, 23846, 21481, 7888, 1754, 327, 253, 14923, 17356, 1232, 305, 17788, 285, 2722, 690, 4679, 10941, 253, 8770, 273, 246, 1266, 81, 2429, 281, 253, 3786, 4081, 260, 1266, 81, 2429, 281, 260, 1266, 81, 253, 4081, 246, 1266, 81, 1057, 417, 2430, 281, 4993, 253, 1180, 273, 7888, 247, 30400, 285, 556, 8936, 3045, 275, 2426, 273, 3114, 5481, 285, 10554, 273, 5816, 9297, 327, 13506, 285, 1524, 10186, 14580, 256, 50274, 783, 2934, 273, 6941, 323, 21481, 7888, 3066, 6906, 920, 275, 253, 305, 17788, 1566, 310, 4722, 285, 747, 50273, 85, 1266, 81, 17923, 1805, 347, 2429, 281, 260, 1266, 81, 50276, 88, 50273, 783, 2929, 3133, 625, 751, 247, 7482, 8783, 273, 789, 310, 3058, 281, 3157, 253, 9759, 323, 1650, 625, 8813, 273, 24550, 285, 2905, 789, 1805, 18872, 9410, 285, 625, 2590, 9759, 273, 253, 1006, 800, 5933, 1754, 327, 278, 3591, 68, 50276, 783, 3045, 275, 2426, 273, 3114, 11998, 50276, 10580, 14433, 310, 8936, 342, 1675, 281, 260, 1266, 81, 533, 352, 1057, 417, 1646, 12085, 342, 643, 2130, 5609, 323, 253, 2173, 8892, 594, 253, 4477, 943, 19148, 752, 253, 2022, 7680, 4555, 403, 275, 1798, 281, 752, 6070, 597, 4944, 253, 5145, 4715, 3114, 50276, 783, 789, 943, 1705, 342, 271, 17909, 2127, 326, 476, 320, 908, 281, 18302, 253, 4679, 2011, 275, 253, 7714, 285, 281, 1581, 253, 897, 273, 253, 3632, 4216, 14156, 407, 2852, 8607, 50275, 783, 2929, 1057, 417, 2319, 2442, 12291, 50276, 7152, 33032, 2520, 2929, 4081, 247, 747, 7792, 323, 18918, 253, 3114, 2605, 275, 253, 2990, 597, 23115, 253, 21481, 3114, 5289, 275, 253, 2990, 407, 13633, 253, 14923, 17356, 1232, 1266, 81, 1566, 50276, 783, 1566, 806, 4561, 247, 21624, 1554, 32608, 835, 9297, 403, 7922, 7293, 327, 253, 10621, 1430, 273, 7632, 285, 616, 3114, 14199, 84, 840, 253, 9297, 275, 253, 1554, 32608, 326, 4684, 767, 1027, 7888, 403, 289, 13172, 440, 23705, 494, 253, 2540, 2990, 310, 26115, 347, 253, 12378, 273, 253, 21624, 1554, 32608, 22914, 253, 289, 13172, 9297, 597, 671, 2530, 271, 5919, 1039, 281, 3359, 253, 12637, 17032, 407, 17617, 253, 23507, 2867, 273, 253, 2540, 2990, 5742, 597, 19958, 253, 3114, 34860, 386, 285, 3114, 37600, 386, 9297, 11794, 1754, 327, 253, 2540, 2990, 50276, 14094, 4679, 671, 2692, 326, 436, 747, 1566, 12372, 253, 3114, 14199, 275, 253, 2990, 1805, 685, 2045, 1566, 50275, 2520, 747, 7792, 4081, 275, 436, 2929, 310, 4722, 436, 747, 1566, 476, 9441, 253, 21481, 3114, 5289, 275, 253, 2990, 534, 556, 1142, 4893, 285, 45465, 247, 2257, 273, 4116, 275, 253, 2561, 436, 747, 1566, 28174, 253, 37139, 414, 285, 6804, 3114, 14199, 275, 1524, 10186, 6928, 50275, 2520, 747, 1566, 6508, 253, 14923, 17356, 1232, 1266, 81, 1566, 407, 16984, 253, 6804, 3114, 14199, 323, 1016, 4666, 253, 5853, 908, 275, 436, 6880, 310, 275, 690, 3282, 2629, 407, 17617, 253, 37139, 414, 275, 253, 2540, 2990, 50275, 9820, 597, 5393, 690, 2442, 2087, 5904, 273, 616, 1566, 407, 1690, 3081, 2130, 1491, 50276, 187, 187, 4118, 18435, 27, 783, 4477, 1246, 247, 7792, 323, 6906, 920, 9297, 432, 432, 3632, 4216, 1524, 5904, 432, 253, 14923, 17356, 1232, 305, 17788, 281, 6635, 23507, 14580, 342, 6804, 3114, 14199, 84, 253, 4477, 2085, 271, 5919, 1114, 442, 1113, 4213, 3082, 326, 4311, 749, 3362, 83, 5372, 342, 253, 1180, 273, 7632, 627, 403, 7350, 670, 9171, 1430, 273, 253, 4081, 1332, 285, 697, 38135, 689, 253, 305, 17788, 1754, 5140, 273, 1113, 251, 285, 30013, 253, 30628, 671, 3877, 326, 253, 2929, 3198, 4737, 4361, 285, 247, 30909, 47284, 209 ]
[ all-ones mask array omitted ]
[ token-id array omitted ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper is well organized, with a clear idea of the proposed method and good descriptions of related work. overall the descriptions are clear and easy to follow, but the experimental results need clarifying. regarding the multi-digit translation task, it is not straightforward to this reviewer how the proposed method could match the digits' semantics with different colors (style) in different locations; the description in the paper is not enough to explain the results in fig 6. to this reviewer this task is more complex than the street-view translation one. in the same line, it is curious what the results would be if digits with different colors were overlapping at random locations rather than in the grid-like arrangement. ### Summary:
this paper proposes an image-to-image translation technique which decomposes translation into style and content transfer, using a semantic consistency loss to encourage corresponding semantics via feature masks before and after translation. performance is evaluated on a set of mnist variants as well as on simulated-to-real-world driving imagery. all reviewers found the paper well written, with a clear contribution compared to related work, by focusing on the problem where one-to-one mappings are not available across two domains that also have multimodal content or sub-style. the main weakness discussed by the reviewers relates to the experiments and whether the set provided effectively validates the proposed approach; the authors argue for their use of mnist as a toy problem with full control to clearly validate their approach, and their semantic segmentation experiment shows a modest performance improvement. based on the experiments as they stand and the relative novelty of the proposed approach, the ac recommends poster and encourages the authors to extend their analysis of the current results in a final version.
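the mechanism this summary refers to, a semantic consistency loss computed on masked features before and after translation, could be sketched roughly as below; the feature extractor, the source of the mask, and the choice of an l1 distance are assumptions made only for illustration, not the reviewed paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def semantic_consistency_loss(x_src, x_trans, feat_net, mask):
    """Hedged sketch: penalize feature drift between an image and its
    translation on the regions selected by a (pre-computed) feature mask.
    `feat_net` is assumed to be a frozen feature extractor; `mask` is a
    soft mask broadcastable to the feature maps."""
    with torch.no_grad():
        f_src = feat_net(x_src)        # features before translation
    f_trans = feat_net(x_trans)        # features after translation
    # distance on masked features encourages matching semantics across domains
    return F.l1_loss(mask * f_trans, mask * f_src)
```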
[ input_ids, attention_mask, and labels arrays for this example omitted ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: nonparametric estimation of conditional moment models in high dimensions has been widely and well studied in the literature. in recent years many works have been dedicated to establishing the large-sample properties for the case where the conditioning variable has a small intrinsic dimension, and found that the optimal convergence rate depends only on d; see the references in the paper, and see also jiao et al 2021 and the references therein. the knn method of approximating the geodesic distance on the lower intrinsic-dimensional manifold is commonly used, see tenenbaum et al 2000, kpotufe 2011, chen and muller 2012, among others. the authors generalise the asymptotic results of wager and athey 2018 and athey et al 2019 for subsampled random forests, and that of fan et al 2019 for the subsampled 1nn estimator in the high-dimension setting, to their subsampled knn generalised method of moments in the low intrinsic dimension regime. inspired by the ideas in kpotufe 2011, the authors propose a data-driven method to choose the tuning parameter. there are two main contributions of the paper: 1 the explicit form of the asymptotic variance of the estimator, and 2 the data-driven tuning parameter that can guarantee the optimal convergence rate of the estimator. nevertheless i have the following major concerns.

1 the motivation for using the subsampling and fixing the value of k of the knn is unclear. without the subsampling, the estimator is essentially a uniform kernel estimator where the kernel shrinkage acts like the bandwidth of a standard kernel estimator and depends on the sample size n and the knn parameter k. however, it is known that if k satisfies certain rate conditions such an estimator can also achieve the optimal convergence rate of a nonparametric estimator, and one may also incorporate kernel functions other than a uniform one into the knn to improve the practical performance, see bickel and li 2007, kpotufe 2011, padilla et al 2020 for example. then what is the advantage of using the subsampling?

2 the authors treat the knn parameter k as a constant in the paper. as i mentioned in the first question, the kernel shrinkage parameter should depend on k. what would the asymptotic results of the estimator be if we allow k to depend on the sample size n?

3 in practice k has to be chosen to construct the estimator. could the authors discuss how to determine k and how k affects the practical performance?

4 in the theorems in the paper, are lambda, psimax and p constants lying in (0, infinity)? if so, taking b = 1 and s = n (which corresponds to the knn estimator without subsampling) in the upper bound of the square root of the mean squared error of the estimator, we have olambda1psimaxsqrtn1psloglogpnso1, which may not converge to 0. why is that? please correct me if i misunderstand anything.

5 in the paper the authors provide the explicit form of the asymptotic variance and say that, if they have an estimate of the variance, they can build plug-in confidence intervals based on the asymptotic normality result. however, the exact estimator of the variance is not stated; since this is one of the main contributions of the paper, the authors should be clear here. based on the formula of the variance, many different estimation methods can be used; do they all have similar theoretical and practical performance?

6 in the simulation the authors consider only one setting: one model, one sample size, one number of dimensions of the covariates, and one intrinsic
dimension. this does not reflect the theories well. a the sample size is set to n = 20000 while the intrinsic dimension d is only 2; usually with d = 2 a sample size of n = 5000 should be large enough to obtain a good performance. the authors may consider n = 1000, 2000, and 5000 to present the convergence of the estimation and the coverage of the confidence intervals. b it would also be interesting to see a comparison between the proposed subsampled method and the knn estimator without subsampling but with k chosen by cross-validation.

7 although the authors mention that the conditional moment models are related to heterogeneous treatment effect problems, the connection of the results to causal learning and reasoning is not straightforward. indeed, in the treatment effect problems other issues need to be considered in the estimation, such as confounding; therefore the estimation and inference results in the paper cannot be directly applied. this raises concerns about whether the paper is suitable for the conference.

references
athey s, tibshirani j and wager s 2019 generalized random forests. the annals of statistics 47 1148-1178
bickel p j and li b 2007 local polynomial regression on unknown manifolds. complex datasets and inverse problems, institute of mathematical statistics, 177-186
chen d and muller hg 2012 nonlinear manifold representations for functional data. the annals of statistics 40 1-29
fan y, lv j and wang j 2018 dnn: a two-scale distributional tale of heterogeneous treatment effect inference. arxiv preprint arxiv:1808.08469
jiao y, shen g, lin y and huang j 2021 deep nonparametric regression on approximately low-dimensional manifolds. arxiv preprint arxiv:2104.06708
kpotufe s 2011 knn regression adapts to local intrinsic dimension. in advances in neural information processing systems, 729-737
tenenbaum j b, de silva v and langford j c 2000 a global geometric framework for nonlinear dimensionality reduction. science 290 2319-2323
padilla o h m, sharpnack j and chen y 2020 adaptive nonparametric regression with the k-nearest neighbour fused lasso. biometrika 107 293-310

the reviewer is familiar with conditional moment models and their application, familiar with intrinsic dimensions, but only has textbook knowledge about nonparametric estimation. here are my detailed comments. 1 the problem can use more motivation, e.g. in which application area is it more common to have a high-dimensional set of conditioning variables? in many cases one can estimate the causal structure, and given the estimated causal structure only a smaller number of variables need to be conditioned on. 2 simulations: is there any reason that the specific function family / data-generation function was chosen? it would also be interesting to see a numerical illustration/demonstration of the bound. 3 it would be helpful if the authors can comment on the complexity of the estimation. 4 assumptions: can the authors comment on some of the assumptions with respect to testability or how likely they are to be met, in other words, guidance for when the methodology is applicable for people who are considering using it on real-world data.

technical quality: the paper establishes the rate and asymptotic normality of the estimator, in particular under an unknown intrinsic dimension; a data-driven procedure is shown to be adaptive. in terms of its technical strengths and weaknesses in the context of the literature, i am unable to provide an informed assessment as i am not familiar with related works. clarity: the paper seems quite dense and i find it very hard to follow; after all, there is not much room left after
introducing the definitions and conditions and stating the results. there are a few broken sentences in the first paragraph of page 3. originality: the authors seem to be able to build upon previous works and generalize their results, e.g. the definition of intrinsic dimension and the asymptotic normality of the subsampled knn estimator. significance: as the authors mentioned in the introduction, i presume that the results here are relevant to several problems in causal inference, but the impact is not immediately clear to me; perhaps the authors can comment on the application of their results. ### Summary:
the paper is concerned with nonparametric estimation and focuses on explaining why nonparametric estimation happens to work well even in high dimensions: it shows that the estimation and asymptotic normality results depend only on the intrinsic dimension of the data. the proposed approach is based on a subsampled ensemble of the k-nearest neighbors, and an adaptive procedure is proposed to identify the intrinsic dimension. it might be interesting to comment on the relationship between this procedure and the paper "estimating the intrinsic dimension of datasets by a minimal neighborhood information" (facco et al, nature 2017). all reviewers are interested in the presented approach, which is in the line of, and extends, previous works by athey et al, and the authors' rebuttal clearly addressed the questions. some effort in making the revised version of the paper easier to follow is mandatory: please discuss the motivations and the complexity of solving the weighted generalized moment equation depending on its structure, and provide some guidelines for practitioners; some reviewers ask the authors to make the paper easier to follow.
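for readers unfamiliar with the estimator being discussed, the basic construction (an ensemble of k-nn estimates computed on random subsamples of the data) can be sketched as below; the fixed k, subsample size s, and number of subsamples b are placeholder values for illustration, not the paper's tuned choices, and only the plain regression special case of the moment equation is shown.

```python
import numpy as np

def subsampled_knn_regression(X, y, x0, k=1, s=200, B=100, seed=None):
    """Hedged sketch of a subsampled k-NN ensemble: average the k-NN
    estimate at a query point x0 over B random subsamples of size s."""
    rng = np.random.default_rng(seed)
    n = len(X)
    preds = []
    for _ in range(B):
        idx = rng.choice(n, size=s, replace=False)   # draw a subsample
        d = np.linalg.norm(X[idx] - x0, axis=1)      # distances to x0
        nn = idx[np.argsort(d)[:k]]                  # k nearest within the subsample
        preds.append(y[nn].mean())                   # local average
    return float(np.mean(preds))                     # ensemble estimate
```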
[ input_ids, attention_mask, and labels arrays for this example omitted ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper investigates how to use self-supervised learning for multistage visual transformer models. previous works have shown that ssl can learn image correspondences and lead to performant pretrained models, while the multistage models can reduce the computation cost dramatically; this work tries to merge these two trends together. the solution is a new region-based loss that can be applied to the local features, and the comprehensive experiments show the advantages of the resulting models on multiple tasks. the paper has many strengths. it is interesting to see that the loss on local features can work well in the self-supervised learning case; this observation may find more usage in future works on self-supervised learning with transformer-based models. the experiments are comprehensive and convincing: the paper tries the same idea on multiple transformer structures and different tasks, and the proposed method can achieve the efficiency of the multistage transformer models while achieving good self-supervised feature pretraining for tasks including classification, segmentation and detection. the info in the appendix is helpful for result reproduction and detailed understanding. in terms of weaknesses, i do hope that the paper can be more clearly written, such as reorganizing the info between the main text and the appendix to give more intuition of lr. in addition, the detail of table 2 is too scarce to understand directly; i guess the supervised baseline is swin-t, but it is hard to infer from the paper directly. overall the manuscript looks like a good paper; i do hope reading the paper can be easier despite the page limit.

this paper develops an efficient self-supervised vision transformer for learning visual representations. it introduces a multistage architecture with sparse attentions to reduce computation complexity and proposes a new pretraining task of region matching to capture fine-grained region dependencies. the results on imagenet and 18 small datasets or downstream tasks are good and compared with other state-of-the-art approaches. strengths: 1 it introduces the new pretraining task to capture fine-grained region dependencies; 2 the experimental results are good and better than other compared approaches. weaknesses: 1 the patch merging module and sparse self-attention in the multistage vit are very similar to the patch merging in the paper swin transformer (liu et al 2021); the authors should clearly explain the differences between this paper and swin transformer. 2 the equation 2 should add more high-level descriptions for each local feature zi; why can finding the local feature zj from the teacher with the highest cosine similarity capture the fine-grained region dependency? the theoretical novelties are limited; this paper mainly borrows the ideas from two papers, dino (caron et al 2021) and swin transformer (liu et al 2021). however, the experimental results are very good in this paper, and the paper provides a lot of implementation details for others to reproduce the results.

the contribution of the paper comprises the observation that the multistage vision transformer (msvt), as opposed to the monolithic vision transformer (vt), does not produce discriminative patch representations, and a loss function for self-supervised pretraining of msvts that encourages discriminative patch representations. extensive experimental evaluation demonstrates that the proposed loss term, when added to the standard non-contrastive
self-supervised loss, brings a modest but systematic improvement in performance of the msvt (tab 5) and of other architectures (tab 4).

weaknesses: 1 i find the contribution in the form of the loss term incremental with respect to swin, and the performance gain demonstrated in most experiments modest. 2 in my opinion, figure 1 and table 1 misrepresent the performance gain that stems from the contribution of the paper: the baseline architectures swin-b and swin-b w14 already outperform almost all methods in the comparison at a significantly lower number of parameters and much higher throughput; the corresponding numbers are reported in tab 5. i encourage the authors to modify figure 1 and table 1 to represent the increase in performance of swin resulting from the use of the proposed loss term. this can be done by introducing to both parts of figure 1 points representing the performance of the base architectures swin-b, -s and -t trained without $\mathcal{L}_R$; the performance gain due to training them with $\mathcal{L}_R$ can be indicated in the plot with vertical arrows. table 1 can be modified by reporting the performance of the base architectures trained without the proposed loss in addition to their performance when trained with this loss, basically by merging tab 5 into tab 1; the ablation results can still be discussed in a separate section. 3 in the current transfer learning results, the impact of the authors' contribution on transferability of the trained networks is not clear. i would prefer the experiments on learning to classify 18 small datasets to highlight the performance gain/loss that stems from training with $\mathcal{L}_R$, in addition to the current comparison of supervised to unsupervised training; this would imply adding a third bar color to figure 3 and one more row to both parts of table 2 to represent the performance of swin trained in a self-supervised regime but without $\mathcal{L}_R$.

strengths: a i like the storyline of the paper, where an interesting observation about the discriminative matching power of patch descriptors motivates the construction of the loss term. b the proposed loss term is shown to improve both the performance of swin and of other architectures. c the experimental evaluation is extensive in that it also shows limitations of the proposed loss, for example when applied to image segmentation (tab 2 and tab 6). d according to me, the manuscript is well written and easy to understand.

minor questions and editorial suggestions. q1 in the loss term (2) you select j by the cosine distance but you minimize cross-entropy; have you tried using the same score for both tasks, i.e. selecting j with the cross-entropy and minimizing cross-entropy, or selecting j by the cosine distance and minimizing the cosine distance? q2 in the loss term (2), why haven't you tried optimal 1:1 matching (the hungarian algorithm) to match source and target patches? q3 in the paragraph "design choices of $\mathcal{L}_R$" you describe the choice of argmax vs optimal transport in the loss term (2); i do not understand how ot can be used to select the optimal index j, could you clarify this? es1 page 2, "loss of this property" / "to alleviate this issue": perhaps use enumeration instead of bullets and make the statements more specific, e.g. "loss of the property described in 1", "alleviate the issue identified in point 2". es2 page 3, "vit yields 95% accuracy in region-to-region correspondences": consider specifying how you compute the correspondence scores / patch distances (cosine similarity as in subsec 4.4, page 8). es3 in the caption of tab 1, please specify that w14 is the window size. i find the main idea of
this paper convincing, the manuscript well written, and the evaluation thorough. the fact that the authors highlight some limitations of their loss term, in particular when applied to image segmentation, attests to their scientific scrutiny. i think the paper deserves a publication. i am willing to rate it 8/10 instead of 10/10 because i find the contribution incremental and the performance gain modest. in this round of reviews i lower my recommendation to 6 because i find the weakness 2 described above acute: it misrepresents the effect of the contribution on performance in the teaser figure and in the first table with quantitative results. i am confident the authors can fix it in the next revision of the paper, which will prompt me to increase the rating to 8/10. i encourage the authors to also address the weakness 3 in the next revision of the paper; the current transfer learning experiments (section 4.2) do not evaluate the specific contribution of the paper but rather answer more general questions, and i am confident that adding evaluation more focused on the use of $\mathcal{L}_R$ will vastly benefit this part of the paper. i nevertheless do not feel that i should request completing this task within the limited review period and i will not lower my recommendation on this basis. edit: the authors addressed the weaknesses 2 and 3 listed in the main review and answered all my questions; i am therefore happy to endorse the paper. this is solid work and deserves to be published. ### Summary:
this paper proposes two techniques for improving self-supervised learning with a vision transformer. the first improvement is using a multistage vit, which is very similar to the swin transformer, and the authors recognized this is not a major contribution. the authors further found that using a multistage vit does not produce discriminative patch representations, thus proposing the second improvement with a region-level loss. while both improvements are not particularly novel by themselves, combining both leads to a strong empirical result; however, it does look like the multiscale vision transformer is the major improvement, as removing the regional loss only leads to less than a 1-point decrease in performance in most cases. in general this is a good engineering paper with a practical approach for improving self-supervised learning with vision transformers, and it obtained strong results, thus it is worthy of publication.
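the region-level matching step debated in q1 above (select the teacher patch by cosine similarity, then minimize a cross-entropy between the matched patch distributions) could look roughly like the sketch below; the projection heads, temperature, and tensor shapes are assumptions for illustration rather than the paper's exact implementation.

```python
import torch
import torch.nn.functional as F

def region_matching_loss(z_student, z_teacher, head_s, head_t, temp=0.1):
    """Hedged sketch: z_student and z_teacher are (num_patches, dim) local
    features from the student and (momentum) teacher for the same image.
    Each student patch is matched to its most cosine-similar teacher patch,
    and a DINO-style cross-entropy is minimized on that pair."""
    sim = F.normalize(z_student, dim=-1) @ F.normalize(z_teacher, dim=-1).T
    match = sim.argmax(dim=-1)                          # j = argmax cosine(z_i, z_j)
    p_teacher = F.softmax(head_t(z_teacher[match]) / temp, dim=-1).detach()
    logp_student = F.log_softmax(head_s(z_student) / temp, dim=-1)
    return -(p_teacher * logp_student).sum(dim=-1).mean()
```

the reviewer's q2 about hungarian matching would amount to replacing the argmax line with an optimal 1:1 assignment over the similarity matrix, which is one of the design choices the review asks the authors to discuss.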
[ token-id array omitted ]
265, 19, 3239, 495, 9084, 11026, 5325, 7200, 275, 50276, 17187, 85, 410, 72, 279, 2723, 2979, 50276, 15603, 31238, 849, 368, 11897, 253, 17668, 46254, 2076, 1506, 13849, 7349, 460, 14259, 347, 275, 749, 1704, 7127, 3239, 854, 50275, 265, 20, 275, 253, 11743, 273, 10334, 337, 4496, 13199, 259, 1047, 310, 253, 3497, 1979, 891, 1089, 253, 2022, 2934, 273, 436, 2929, 21414, 253, 7714, 973, 3542, 285, 253, 7103, 11080, 253, 958, 326, 253, 4477, 6780, 690, 7364, 273, 616, 2957, 1307, 275, 1798, 672, 3732, 281, 2460, 26405, 863, 6655, 281, 616, 8249, 24852, 891, 1158, 253, 2929, 22828, 247, 9311, 891, 717, 7378, 281, 2281, 352, 854, 740, 3185, 273, 8437, 17, 984, 891, 1089, 253, 7680, 32809, 285, 253, 3045, 6351, 16453, 50276, 249, 436, 3790, 273, 10123, 891, 2406, 619, 17401, 281, 721, 984, 891, 1089, 253, 14855, 374, 2529, 1840, 7928, 352, 3731, 4762, 5957, 253, 1055, 273, 253, 7680, 327, 3045, 275, 253, 716, 12290, 4677, 285, 275, 253, 806, 2829, 342, 11745, 1543, 891, 717, 13224, 253, 4477, 476, 4993, 352, 275, 253, 1735, 18520, 273, 253, 2929, 534, 588, 8959, 479, 281, 2572, 253, 13716, 281, 854, 740, 50276, 74, 11907, 253, 4477, 281, 671, 2953, 253, 14855, 495, 275, 253, 1735, 18520, 273, 253, 2929, 253, 1655, 3700, 4715, 4679, 2593, 5976, 513, 417, 7472, 253, 2173, 7680, 273, 253, 2929, 533, 2581, 3662, 625, 2087, 3533, 891, 717, 13224, 326, 6240, 7103, 625, 7106, 327, 253, 897, 273, 14168, 4065, 83, 588, 37078, 5649, 436, 629, 273, 253, 2929, 891, 17837, 513, 417, 1928, 326, 891, 943, 2748, 21006, 436, 4836, 1561, 253, 3710, 2278, 2180, 285, 891, 588, 417, 2406, 619, 17401, 327, 436, 3720, 50276, 15576, 253, 4477, 9713, 253, 32213, 374, 285, 495, 7117, 275, 253, 2022, 2278, 285, 9577, 512, 619, 3533, 891, 717, 3103, 5211, 281, 18883, 253, 2929, 436, 310, 247, 4891, 789, 285, 22828, 281, 320, 3863, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 767, 5609, 323, 11138, 1881, 35421, 4715, 342, 247, 8113, 39707, 253, 806, 7756, 310, 970, 247, 1554, 382, 486, 9084, 534, 310, 1077, 2074, 281, 1863, 249, 39707, 285, 4477, 7478, 436, 310, 417, 247, 2201, 7680, 253, 4477, 2007, 1119, 326, 970, 247, 1554, 382, 486, 9084, 1057, 417, 4711, 20741, 800, 12097, 6779, 3021, 36636, 253, 1273, 7756, 342, 247, 2919, 1268, 2957, 1223, 1097, 11701, 403, 417, 3782, 4460, 407, 3746, 16248, 1097, 5644, 281, 247, 2266, 16774, 906, 2299, 352, 1057, 4453, 751, 253, 1554, 2865, 1079, 8113, 39707, 310, 253, 2201, 7756, 347, 11922, 253, 9933, 2957, 760, 5644, 281, 1679, 685, 337, 6379, 275, 3045, 275, 954, 2219, 275, 2087, 436, 310, 247, 1175, 11369, 2929, 342, 247, 8542, 2746, 323, 11138, 1881, 35421, 4715, 342, 8113, 9261, 285, 2797, 2266, 1543, 3021, 697, 18338, 273, 9311 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 2340, 684, 849, 281, 897, 1881, 35421, 4715, 323, 1554, 382, 486, 5304, 39707, 3210, 2045, 2987, 452, 2011, 326, 256, 3433, 476, 3037, 2460, 2723, 2979, 285, 1421, 281, 1347, 386, 3215, 11273, 3210, 1223, 253, 1554, 382, 486, 3210, 476, 4796, 253, 13782, 2105, 16821, 436, 789, 14177, 281, 17310, 841, 767, 13554, 2366, 253, 2900, 310, 247, 747, 2919, 3169, 2957, 326, 476, 320, 3732, 281, 253, 1980, 3386, 253, 11088, 4679, 921, 253, 11361, 273, 253, 4795, 3210, 327, 2709, 8892, 253, 2929, 556, 1142, 20544, 50275, 262, 310, 4722, 281, 923, 326, 253, 2957, 327, 1980, 3386, 476, 789, 973, 275, 253, 1881, 35421, 4715, 1083, 436, 8310, 778, 1089, 625, 10393, 275, 253, 2852, 2987, 273, 2509, 1881, 35421, 4715, 327, 39707, 3169, 3210, 50275, 783, 4679, 403, 11088, 285, 21414, 253, 2929, 14177, 253, 1072, 2934, 327, 2709, 39707, 5289, 285, 1027, 8892, 253, 4081, 1332, 476, 5115, 253, 6733, 273, 253, 1554, 382, 486, 39707, 3210, 1223, 17170, 1175, 1881, 35421, 4735, 3215, 26208, 323, 8892, 1690, 9162, 26405, 285, 5481, 50275, 783, 8692, 275, 253, 30762, 310, 9371, 323, 906, 21068, 285, 7000, 4685, 50276, 249, 2426, 273, 14855, 891, 513, 3524, 326, 253, 2929, 476, 320, 625, 4518, 3542, 824, 347, 294, 7397, 3006, 253, 8692, 875, 2022, 2505, 285, 30762, 281, 1918, 625, 30328, 273, 298, 83, 50276, 249, 1635, 253, 2508, 273, 2829, 374, 310, 1512, 29967, 281, 2096, 3587, 891, 5476, 253, 22296, 8245, 310, 1863, 565, 533, 352, 310, 1892, 281, 9441, 432, 253, 2929, 3587, 4583, 253, 7714, 4453, 751, 247, 1175, 2929, 891, 513, 3524, 4361, 253, 2929, 476, 320, 6927, 5747, 253, 3239, 2701, 5474, 33032, 2520, 2929, 24357, 271, 5919, 1881, 35421, 8113, 39707, 323, 4715, 5304, 14237, 352, 23970, 247, 1554, 382, 486, 10336, 342, 23507, 33056, 621, 281, 4796, 13782, 10454, 285, 29328, 247, 747, 3215, 26208, 4836, 273, 2919, 11038, 281, 9232, 4030, 72, 11273, 2919, 21011, 253, 1543, 327, 253, 4440, 257, 292, 285, 1283, 1355, 15302, 390, 15450, 8892, 403, 1175, 285, 2429, 342, 643, 1375, 23037, 14387, 7274, 50276, 296, 3755, 20556, 50275, 18, 352, 23970, 253, 747, 3215, 26208, 4836, 281, 9232, 4030, 72, 11273, 2919, 21011, 374, 253, 5661, 1543, 403, 1175, 285, 1805, 685, 643, 2429, 7274, 50276, 20881, 1255, 265, 337, 253, 12097, 34047, 6333, 285, 23507, 1881, 42959, 275, 253, 1554, 382, 486, 9084, 403, 1077, 2074, 281, 253, 12097, 34047, 275, 253, 2929, 1863, 249, 39707, 632, 86, 1162, 355, 43425, 253, 4477, 943, 4518, 5513, 253, 3910, 875, 436, 2929, 285, 1863, 249, 39707, 374, 783, 5150, 374, 943, 823, 625, 1029, 5251, 20121, 323, 1016, 1980, 4735, 1182, 74, 2139, 4560, 253, 1980, 4735, 1182, 75, 432, 253, 9732, 342, 253, 4585, 7349, 460, 14259, 476, 9232, 253, 4030, 72, 11273, 2919, 18925, 50276, 783, 10527, 4460, 2890, 403, 3710, 436, 2929, 7194, 13179, 84, 253, 5697, 432, 767, 9380, 277, 2610, 1113, 251, 1162, 355, 43425, 285, 1863, 249, 39707, 632, 86, 1162, 355, 43425, 2299, 253, 5661, 1543, 403, 1077, 1175, 275, 436, 2929, 285, 253, 2929, 3400, 247, 2257, 273, 7092, 4278, 323, 2571, 281, 18302, 253, 1543, 50276, 7152, 339, 431, 248, 7680, 273, 253, 2929, 12093, 50276, 783, 8310, 326, 253, 1554, 382, 486, 8113, 39707, 278, 11427, 85, 347, 10066, 281, 253, 1114, 36842, 8113, 39707, 362, 85, 1057, 417, 4711, 20741, 800, 12097, 6779, 285, 50276, 66, 2957, 1159, 323, 1881, 35421, 3215, 26208, 273, 278, 11427, 1641, 326, 29426, 20741, 800, 12097, 6779, 50276, 2068, 
3134, 5661, 7103, 14371, 326, 253, 4081, 2957, 1307, 672, 2879, 281, 253, 2629, 1327, 45842, 422, 1881, 35421, 2957, 10316, 247, 16453, 533, 12082, 7756, 275, 3045, 273, 253, 278, 11427, 85, 10334, 608, 285, 643, 35615, 10334, 577, 50276, 20881, 1255, 265, 337, 891, 1089, 253, 7680, 275, 253, 830, 273, 253, 2957, 1307, 32809, 342, 1675, 281, 1863, 249, 285, 253, 3045, 6351, 5183, 275, 954, 4679, 16453, 374, 275, 619, 4743, 4677, 337, 285, 2829, 337, 25355, 253, 3045, 6351, 326, 23880, 432, 253, 7680, 273, 253, 2929, 253, 8245, 10336, 1863, 249, 67, 285, 1863, 249, 39220, 1047, 2168, 41731, 13015, 2761, 512, 3082, 275, 253, 5301, 387, 247, 3012, 2406, 1180, 273, 3602, 285, 1199, 2169, 28519, 253, 3969, 3904, 403, 2361, 275, 10334, 608, 50276, 74, 11907, 253, 4477, 281, 10007, 4677, 337, 285, 2829, 337, 281, 1957, 253, 2572, 275, 3045, 273, 1863, 249, 4795, 432, 253, 897, 273, 253, 4081, 2957, 1307, 436, 476, 320, 2218, 407, 16984, 281, 1097, 4243, 273, 4677, 337, 2792, 9999, 253, 3045, 273, 253, 2613, 35615, 1863, 249, 67, 256, 285, 246, 10166, 1293, 14168, 4065, 83, 253, 3045, 6351, 1955, 281, 3733, 731, 342, 14168, 4065, 83, 476, 320, 4860, 275, 253, 7484, 342, 9118, 18159, 2829, 337, 476, 320, 7321, 407, 9610, 253, 3045, 273, 253, 2613, 35615, 10166, 1293, 253, 4081, 2957, 275, 1635, 281, 616, 3045, 672, 10166, 342, 436, 2957, 10323, 407, 34047, 10334, 608, 715, 10334, 337, 253, 28913, 1543, 476, 1335, 320, 5469, 275, 247, 4858, 2593, 495, 275, 253, 1655, 3700, 4715, 1543, 253, 3486, 273, 253, 4477, 7680, 327, 3700, 1430, 273, 253, 10166, 6928, 310, 417, 2590, 891, 651, 4510, 253, 4679, 275, 4715, 281, 30215, 1283, 1355, 15302, 281, 6780, 253, 3045, 6351, 18585, 326, 23880, 432, 3733, 342, 14168, 4065, 83, 275, 1635, 281, 253, 1655, 5301, 273, 22296, 281, 440, 35421, 3733, 436, 651, 16084, 6240, 247, 2626, 2534, 3295, 281, 4677, 495, 285, 581, 625, 4194, 281, 1097, 4243, 273, 2829, 374, 281, 1957, 253, 3045, 273, 1863, 249, 10166, 275, 247, 1881, 35421, 9459, 533, 1293, 14168, 4065, 83, 50275, 296, 3755, 20556, 50276, 66, 891, 751, 253, 44803, 273, 253, 2929, 835, 271, 4722, 8310, 670, 253, 20741, 800, 11038, 1612, 273, 12097, 42785, 15265, 684, 253, 5140, 273, 253, 2957, 1307, 50276, 67, 253, 4081, 2957, 1307, 310, 2011, 281, 3157, 1097, 253, 3045, 273, 1863, 249, 285, 273, 643, 35615, 50276, 68, 253, 5661, 7103, 310, 9470, 275, 326, 352, 671, 2722, 7364, 273, 253, 4081, 2957, 323, 1650, 672, 3732, 281, 2460, 26405, 10334, 374, 285, 10334, 721, 50276, 69, 2556, 281, 479, 253, 7714, 310, 973, 3542, 285, 3477, 281, 2096, 50276, 37585, 3533, 285, 21977, 13991, 50276, 82, 18, 275, 253, 2957, 1307, 374, 368, 3609, 480, 407, 253, 7349, 460, 4181, 533, 368, 15338, 2831, 15579, 452, 368, 3597, 970, 253, 1072, 4868, 323, 1097, 8892, 26332, 17221, 480, 342, 253, 2831, 290, 10144, 285, 28699, 2831, 15579, 390, 17221, 480, 407, 253, 7349, 460, 4181, 285, 28699, 253, 7349, 460, 4181, 50276, 82, 19, 275, 253, 2957, 1307, 374, 2139, 419, 2254, 368, 3597, 8654, 1903, 11038, 253, 10416, 6656, 5933, 281, 3761, 2603, 285, 2303, 20412, 50275, 82, 20, 275, 253, 12494, 2216, 10165, 273, 14168, 4065, 83, 368, 6266, 253, 4327, 273, 1736, 4090, 4632, 8654, 4616, 275, 253, 2957, 1307, 374, 891, 513, 417, 2096, 849, 14366, 476, 320, 908, 281, 3609, 253, 8654, 3605, 480, 812, 368, 19148, 436, 50274, 265, 18, 3239, 374, 2957, 273, 436, 2867, 281, 33623, 436, 2523, 4931, 897, 46223, 3185, 273, 29093, 285, 1056, 253, 7234, 625, 2173, 2957, 273, 253, 2867, 2529, 275, 337, 33623, 253, 2523, 3636, 275, 1127, 374, 50276, 
265, 19, 3239, 495, 9084, 11026, 5325, 7200, 275, 50276, 17187, 85, 410, 72, 279, 2723, 2979, 50276, 15603, 31238, 849, 368, 11897, 253, 17668, 46254, 2076, 1506, 13849, 7349, 460, 14259, 347, 275, 749, 1704, 7127, 3239, 854, 50275, 265, 20, 275, 253, 11743, 273, 10334, 337, 4496, 13199, 259, 1047, 310, 253, 3497, 1979, 891, 1089, 253, 2022, 2934, 273, 436, 2929, 21414, 253, 7714, 973, 3542, 285, 253, 7103, 11080, 253, 958, 326, 253, 4477, 6780, 690, 7364, 273, 616, 2957, 1307, 275, 1798, 672, 3732, 281, 2460, 26405, 863, 6655, 281, 616, 8249, 24852, 891, 1158, 253, 2929, 22828, 247, 9311, 891, 717, 7378, 281, 2281, 352, 854, 740, 3185, 273, 8437, 17, 984, 891, 1089, 253, 7680, 32809, 285, 253, 3045, 6351, 16453, 50276, 249, 436, 3790, 273, 10123, 891, 2406, 619, 17401, 281, 721, 984, 891, 1089, 253, 14855, 374, 2529, 1840, 7928, 352, 3731, 4762, 5957, 253, 1055, 273, 253, 7680, 327, 3045, 275, 253, 716, 12290, 4677, 285, 275, 253, 806, 2829, 342, 11745, 1543, 891, 717, 13224, 253, 4477, 476, 4993, 352, 275, 253, 1735, 18520, 273, 253, 2929, 534, 588, 8959, 479, 281, 2572, 253, 13716, 281, 854, 740, 50276, 74, 11907, 253, 4477, 281, 671, 2953, 253, 14855, 495, 275, 253, 1735, 18520, 273, 253, 2929, 253, 1655, 3700, 4715, 4679, 2593, 5976, 513, 417, 7472, 253, 2173, 7680, 273, 253, 2929, 533, 2581, 3662, 625, 2087, 3533, 891, 717, 13224, 326, 6240, 7103, 625, 7106, 327, 253, 897, 273, 14168, 4065, 83, 588, 37078, 5649, 436, 629, 273, 253, 2929, 891, 17837, 513, 417, 1928, 326, 891, 943, 2748, 21006, 436, 4836, 1561, 253, 3710, 2278, 2180, 285, 891, 588, 417, 2406, 619, 17401, 327, 436, 3720, 50276, 15576, 253, 4477, 9713, 253, 32213, 374, 285, 495, 7117, 275, 253, 2022, 2278, 285, 9577, 512, 619, 3533, 891, 717, 3103, 5211, 281, 18883, 253, 2929, 436, 310, 247, 4891, 789, 285, 22828, 281, 320, 3863, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 767, 5609, 323, 11138, 1881, 35421, 4715, 342, 247, 8113, 39707, 253, 806, 7756, 310, 970, 247, 1554, 382, 486, 9084, 534, 310, 1077, 2074, 281, 1863, 249, 39707, 285, 4477, 7478, 436, 310, 417, 247, 2201, 7680, 253, 4477, 2007, 1119, 326, 970, 247, 1554, 382, 486, 9084, 1057, 417, 4711, 20741, 800, 12097, 6779, 3021, 36636, 253, 1273, 7756, 342, 247, 2919, 1268, 2957, 1223, 1097, 11701, 403, 417, 3782, 4460, 407, 3746, 16248, 1097, 5644, 281, 247, 2266, 16774, 906, 2299, 352, 1057, 4453, 751, 253, 1554, 2865, 1079, 8113, 39707, 310, 253, 2201, 7756, 347, 11922, 253, 9933, 2957, 760, 5644, 281, 1679, 685, 337, 6379, 275, 3045, 275, 954, 2219, 275, 2087, 436, 310, 247, 1175, 11369, 2929, 342, 247, 8542, 2746, 323, 11138, 1881, 35421, 4715, 342, 8113, 9261, 285, 2797, 2266, 1543, 3021, 697, 18338, 273, 9311 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the work identifies factors in the scalability of gnn based solution methods for relational mdps and develops methods to utilize them the main focus is on state variables that no actions change but that still determine the system behavior and which are not available to symnet but are available to enfanet this is an interesting observation but this issue and its solutions do not appear to be very profound you characterize prost as a stateoftheart online planner for mdps the area of mdps is quite broad and it would be useful to characterize this more accurately my guess is that you are talking about mdp methods that have been tested on the ippc problems you could explain the experiment setup and the evaluation metric alpha that is used in table 1 a little bit deeper if i am guessing correctly the neural mdp solvers generate an action sequence very quickly so there is no question about how high the runtimes are and in addition to learning effort only the rewards the action sequences produce are interesting to compare the higher mean alpha for enfanet is due to the two cases in which symnet produces a negative alpha indicating much worse behavior than random it would be interesting to explain this a bit deeper in many cases the differences seem small or insignificant docsepthe paper attacks an important problem namely of learning general sizeindependent policies for relational mdps the approach addresses the representational inadequacy of symnet by adding nonfluents and objects as nodes the experiments in a number of domains show compelling results that it works better than symnet in both rl and imitation modes the results are comparable or better than prost when the problem size is increased the paper is well written and welljustified by the theory the theory is not as strong in certain aspects for example it would be nice to know if the network can represent any finite horizon optimal policy after a finite number of steps of propagation of activations the current results show that the distance between nodes is decreased for the new network which is to be expected when new nodes and edges are added to the network it does not show that the current architecture does not also have another representational inadequacy similar to that shown for symnet the paper badly needs diagrams of instance graphs in symnet and enfanet to clarify the representations and help make your points section 34 that describes training and inference is too brief since the network size depends on the problem size it is not clear how sizeindependent weights are learned how are the activations propagated in the network during inference all this needs more detail the appendix is too elaborate and can be shortened docsepthe core insight is the more principled handling of nonfluents within symnet specifically the paper should present the novel encoding and then show that symnet is a specialized version this is interesting and shows a lot of benefits while the paper focuses on deep neural approaches it would be good to cite some more approaches that are not neural it is critical to highlight some of the assumptions such as the transfer setting for instance traditional boosting approaches to solving rmdps do not need this anyhow this is really a minor issue the main downside is the presentation the paper is very much presented in a symnetcentric way in turn this reads very much like an engineering exercise this however is
not the case in my opinion see q3 the experimental results support the effectiveness of enfanet though the ablation study on neighborhood size should rather be a scaling experiment though the experiments show what they are supposed to show i am still wondering whether one could actually present results where symnet would essentially break down the paper proposes the effectively handling nonfluents and actions network enfanet for solving relational mdps rmdps enfanet builds upon symnet building a denser graph and using a richer decoder the experimental results demonstrate improvements while the paper focuses on deep neural approaches it would be good to cite some more approaches that are not neural it is critical to highlight some of the assumptions such as the transfer setting for instance traditional boosting approaches to solving rmdps do not need this anyhow this is really a minor issue the main downside is the presentation the paper is very much presented in a symnetcentric way in turn this reads very much like an engineering exercise this however is not the case in my opinion the core insight is the more principled handling of nonfluents this should be the center of the paper in other words the paper should present the novel encoding and then show that symnet is a specialized version then also the differences in the learning approach should be highlighted the experimental results support the effectiveness of enfanet though the ablation study on neighbourhood size should rather be a scaling experiment though the experiments show what they are supposed to show i am still wondering whether one could actually present results where symnet would essentially break down to summarize a simple but very effective extension of symnet for solving rmdps while the paper could have been presented in a stronger way the contributions are interesting ### Summary:
meta review the reviewers agreed that this paper makes a valuable contribution there are many suggestions in the reviews and your response that should be taken into account when revising this paper
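the review above asks for a deeper explanation of the evaluation metric alpha and notes that a negative alpha indicates behavior worse than random. below is a minimal sketch of one common normalization that is consistent with that remark; the paper's exact definition is not given in the review, so the formula, function name, and reward inputs here are assumptions for illustration only.

```python
def normalized_score(policy_reward, random_reward, best_reward):
    """Hypothetical normalization consistent with the review's description:
    0 for the random baseline, 1 for the best/reference planner, and
    negative when the evaluated policy does worse than random."""
    return (policy_reward - random_reward) / (best_reward - random_reward)

# example: a policy slightly worse than random yields a negative score
print(normalized_score(policy_reward=-12.0, random_reward=-10.0, best_reward=-2.0))  # -0.25
```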
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 789, 22649, 2616, 275, 253, 9171, 1430, 273, 305, 9866, 1754, 2900, 3082, 323, 38524, 31934, 793, 285, 24357, 3082, 281, 16584, 731, 253, 2022, 2770, 310, 275, 1375, 4903, 326, 642, 5231, 1818, 533, 326, 1335, 3653, 253, 985, 3879, 285, 534, 403, 417, 2130, 323, 18870, 3024, 533, 403, 2130, 281, 546, 20227, 292, 436, 310, 271, 4722, 8310, 533, 436, 2523, 285, 697, 5482, 513, 417, 3176, 281, 320, 1077, 15585, 368, 17710, 17248, 347, 247, 1375, 23037, 14387, 3909, 499, 9582, 323, 31934, 793, 253, 2170, 273, 31934, 793, 310, 3240, 3862, 285, 352, 651, 4217, 281, 17710, 436, 625, 13613, 619, 5476, 310, 326, 368, 403, 5015, 670, 278, 12132, 3082, 326, 452, 644, 5762, 327, 253, 891, 377, 68, 3237, 50276, 5658, 812, 5513, 253, 3368, 9978, 285, 253, 7103, 7982, 9765, 326, 310, 908, 275, 2829, 337, 247, 1652, 2372, 12861, 604, 891, 717, 29985, 9113, 253, 11454, 278, 12132, 1220, 735, 6635, 271, 2250, 3425, 1077, 4541, 594, 627, 310, 642, 1953, 670, 849, 1029, 253, 1408, 3181, 403, 285, 275, 1635, 281, 4715, 3434, 760, 253, 23267, 253, 2250, 6430, 4711, 403, 4722, 281, 7277, 50276, 783, 2169, 1599, 9765, 323, 546, 20227, 292, 310, 1955, 281, 253, 767, 2219, 275, 534, 18870, 3024, 11330, 247, 4016, 9765, 7809, 1199, 7197, 3879, 685, 3632, 352, 651, 320, 4722, 281, 5513, 436, 247, 2372, 12861, 275, 1142, 2219, 253, 3910, 1646, 1355, 390, 34584, 50276, 7152, 339, 431, 248, 2929, 8104, 271, 1774, 1895, 10775, 273, 4715, 2087, 1979, 17777, 7823, 323, 38524, 31934, 793, 50275, 783, 2746, 12453, 253, 1957, 1050, 16273, 1974, 273, 18870, 3024, 407, 6240, 1327, 8501, 592, 285, 5113, 347, 7632, 50275, 783, 4679, 275, 247, 1180, 273, 10625, 921, 18511, 1543, 326, 352, 2987, 1805, 685, 18870, 3024, 275, 1097, 391, 77, 285, 45738, 10006, 253, 1543, 403, 10870, 390, 1805, 685, 17248, 672, 253, 1895, 1979, 310, 2559, 50275, 783, 2929, 310, 973, 3542, 285, 973, 6309, 1245, 407, 253, 3762, 50276, 783, 3762, 310, 417, 347, 2266, 275, 2176, 7794, 323, 1650, 352, 651, 320, 5322, 281, 871, 604, 253, 2990, 476, 1957, 667, 6486, 16892, 8654, 3646, 846, 247, 6486, 1180, 273, 5018, 273, 18634, 273, 1396, 569, 253, 1655, 1543, 921, 326, 253, 4181, 875, 7632, 310, 6137, 323, 253, 747, 2990, 534, 310, 281, 320, 3264, 672, 747, 7632, 285, 9297, 403, 2879, 281, 253, 2990, 352, 1057, 417, 921, 326, 253, 1655, 10336, 1057, 417, 671, 452, 1529, 1957, 1050, 16273, 1974, 2074, 281, 326, 2011, 323, 18870, 3024, 50273, 783, 2929, 16426, 3198, 21302, 273, 4227, 14580, 275, 18870, 3024, 285, 546, 20227, 292, 281, 19148, 253, 14237, 285, 1361, 1056, 634, 2792, 50273, 4674, 5910, 326, 8631, 3733, 285, 275, 22498, 550, 310, 1512, 4864, 1580, 253, 2990, 1979, 7024, 327, 253, 1895, 1979, 352, 310, 417, 2590, 849, 1979, 17777, 13461, 403, 6311, 849, 403, 253, 1396, 569, 46695, 275, 253, 2990, 1309, 17032, 512, 436, 3198, 625, 2508, 50275, 50237, 310, 1512, 21184, 285, 476, 320, 36439, 50276, 7152, 339, 431, 248, 5161, 12288, 310, 253, 625, 3505, 74, 6216, 10885, 273, 1327, 8501, 592, 1561, 18870, 3024, 5742, 253, 2929, 943, 1246, 253, 4460, 9706, 285, 840, 921, 326, 18870, 3024, 310, 247, 18052, 2715, 436, 310, 4722, 285, 2722, 247, 2257, 273, 5373, 50276, 6050, 253, 2929, 16633, 327, 3676, 11454, 7274, 352, 651, 320, 1175, 281, 26542, 690, 625, 7274, 326, 403, 417, 11454, 352, 310, 4619, 281, 6780, 690, 273, 253, 13260, 824, 347, 253, 3700, 4758, 323, 4227, 5899, 43124, 7274, 281, 16161, 391, 
6535, 793, 513, 417, 878, 436, 667, 5430, 436, 310, 1663, 247, 5884, 2523, 50275, 783, 2022, 42719, 310, 253, 9759, 253, 2929, 310, 1077, 1199, 3559, 275, 247, 18870, 3024, 37382, 1039, 275, 1614, 436, 9563, 1077, 1199, 751, 271, 11369, 5763, 436, 2299, 310, 417, 253, 1083, 275, 619, 4743, 923, 2805, 20, 50275, 783, 5661, 1543, 1329, 253, 12510, 273, 546, 20227, 292, 2167, 253, 28913, 1263, 327, 9168, 1979, 943, 2581, 320, 247, 13642, 3368, 50275, 2004, 253, 4679, 921, 752, 597, 403, 6326, 281, 921, 891, 717, 1335, 12371, 1880, 581, 812, 2686, 1246, 1543, 835, 18870, 3024, 651, 9093, 2740, 1066, 50276, 783, 2929, 29328, 8069, 10885, 1327, 8501, 592, 285, 5231, 2990, 546, 20227, 292, 323, 16161, 38524, 31934, 793, 391, 6535, 793, 546, 20227, 292, 21168, 2220, 18870, 3024, 3652, 247, 12006, 254, 4216, 285, 970, 247, 38539, 29810, 253, 5661, 1543, 7568, 11701, 50275, 6050, 253, 2929, 16633, 327, 3676, 11454, 7274, 352, 651, 320, 1175, 281, 26542, 690, 625, 7274, 326, 403, 417, 11454, 436, 310, 4619, 281, 6780, 690, 273, 253, 9376, 824, 347, 253, 3700, 4758, 323, 4227, 5899, 43124, 7274, 281, 16161, 391, 6535, 793, 513, 417, 878, 436, 667, 5430, 436, 310, 1663, 247, 5884, 3374, 50275, 783, 2022, 42719, 310, 253, 9759, 253, 2929, 310, 1077, 1199, 3559, 275, 247, 18870, 3024, 37382, 1039, 275, 1614, 436, 9563, 1670, 90, 1199, 751, 271, 11369, 18418, 436, 2299, 310, 417, 253, 1083, 275, 619, 4743, 253, 5161, 12288, 310, 253, 625, 3505, 74, 6216, 10885, 273, 1327, 8501, 592, 436, 943, 320, 253, 4055, 323, 253, 2929, 275, 643, 3000, 253, 2929, 943, 1246, 253, 4460, 9706, 285, 840, 921, 326, 18870, 3024, 310, 247, 18052, 2715, 840, 671, 253, 3910, 275, 253, 4715, 2746, 943, 320, 16318, 253, 5661, 1543, 1329, 253, 12510, 273, 546, 20227, 292, 2167, 253, 28913, 1263, 327, 425, 74, 19180, 7390, 3639, 1979, 943, 2581, 320, 247, 13642, 4679, 2167, 253, 4679, 921, 752, 597, 403, 9428, 281, 921, 891, 717, 1335, 12371, 1880, 581, 812, 2686, 1246, 1543, 835, 18870, 3024, 651, 9093, 2740, 1066, 50275, 936, 26799, 2969, 533, 1077, 3576, 6880, 273, 18870, 3024, 323, 16161, 391, 6535, 793, 50276, 6050, 253, 2929, 812, 452, 644, 3559, 275, 10046, 1039, 253, 9021, 403, 4722, 50276, 187, 187, 4118, 18435, 27, 13518, 2278, 253, 30628, 5821, 326, 436, 2929, 2789, 247, 9865, 7680, 627, 403, 1142, 13991, 275, 253, 10123, 285, 634, 2380, 326, 943, 320, 2668, 715, 2395, 672, 3585, 2182, 436, 2929 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 789, 22649, 2616, 275, 253, 9171, 1430, 273, 305, 9866, 1754, 2900, 3082, 323, 38524, 31934, 793, 285, 24357, 3082, 281, 16584, 731, 253, 2022, 2770, 310, 275, 1375, 4903, 326, 642, 5231, 1818, 533, 326, 1335, 3653, 253, 985, 3879, 285, 534, 403, 417, 2130, 323, 18870, 3024, 533, 403, 2130, 281, 546, 20227, 292, 436, 310, 271, 4722, 8310, 533, 436, 2523, 285, 697, 5482, 513, 417, 3176, 281, 320, 1077, 15585, 368, 17710, 17248, 347, 247, 1375, 23037, 14387, 3909, 499, 9582, 323, 31934, 793, 253, 2170, 273, 31934, 793, 310, 3240, 3862, 285, 352, 651, 4217, 281, 17710, 436, 625, 13613, 619, 5476, 310, 326, 368, 403, 5015, 670, 278, 12132, 3082, 326, 452, 644, 5762, 327, 253, 891, 377, 68, 3237, 50276, 5658, 812, 5513, 253, 3368, 9978, 285, 253, 7103, 7982, 9765, 326, 310, 908, 275, 2829, 337, 247, 1652, 2372, 12861, 604, 891, 717, 29985, 9113, 253, 11454, 278, 12132, 1220, 735, 6635, 271, 2250, 3425, 1077, 4541, 594, 627, 310, 642, 1953, 670, 849, 1029, 253, 1408, 3181, 403, 285, 275, 1635, 281, 4715, 3434, 760, 253, 23267, 253, 2250, 6430, 4711, 403, 4722, 281, 7277, 50276, 783, 2169, 1599, 9765, 323, 546, 20227, 292, 310, 1955, 281, 253, 767, 2219, 275, 534, 18870, 3024, 11330, 247, 4016, 9765, 7809, 1199, 7197, 3879, 685, 3632, 352, 651, 320, 4722, 281, 5513, 436, 247, 2372, 12861, 275, 1142, 2219, 253, 3910, 1646, 1355, 390, 34584, 50276, 7152, 339, 431, 248, 2929, 8104, 271, 1774, 1895, 10775, 273, 4715, 2087, 1979, 17777, 7823, 323, 38524, 31934, 793, 50275, 783, 2746, 12453, 253, 1957, 1050, 16273, 1974, 273, 18870, 3024, 407, 6240, 1327, 8501, 592, 285, 5113, 347, 7632, 50275, 783, 4679, 275, 247, 1180, 273, 10625, 921, 18511, 1543, 326, 352, 2987, 1805, 685, 18870, 3024, 275, 1097, 391, 77, 285, 45738, 10006, 253, 1543, 403, 10870, 390, 1805, 685, 17248, 672, 253, 1895, 1979, 310, 2559, 50275, 783, 2929, 310, 973, 3542, 285, 973, 6309, 1245, 407, 253, 3762, 50276, 783, 3762, 310, 417, 347, 2266, 275, 2176, 7794, 323, 1650, 352, 651, 320, 5322, 281, 871, 604, 253, 2990, 476, 1957, 667, 6486, 16892, 8654, 3646, 846, 247, 6486, 1180, 273, 5018, 273, 18634, 273, 1396, 569, 253, 1655, 1543, 921, 326, 253, 4181, 875, 7632, 310, 6137, 323, 253, 747, 2990, 534, 310, 281, 320, 3264, 672, 747, 7632, 285, 9297, 403, 2879, 281, 253, 2990, 352, 1057, 417, 921, 326, 253, 1655, 10336, 1057, 417, 671, 452, 1529, 1957, 1050, 16273, 1974, 2074, 281, 326, 2011, 323, 18870, 3024, 50273, 783, 2929, 16426, 3198, 21302, 273, 4227, 14580, 275, 18870, 3024, 285, 546, 20227, 292, 281, 19148, 253, 14237, 285, 1361, 1056, 634, 2792, 50273, 4674, 5910, 326, 8631, 3733, 285, 275, 22498, 550, 310, 1512, 4864, 1580, 253, 2990, 1979, 7024, 327, 253, 1895, 1979, 352, 310, 417, 2590, 849, 1979, 17777, 13461, 403, 6311, 849, 403, 253, 1396, 569, 46695, 275, 253, 2990, 1309, 17032, 512, 436, 3198, 625, 2508, 50275, 50237, 310, 1512, 21184, 285, 476, 320, 36439, 50276, 7152, 339, 431, 248, 5161, 12288, 310, 253, 625, 3505, 74, 6216, 10885, 273, 1327, 8501, 592, 1561, 18870, 3024, 5742, 253, 2929, 943, 1246, 253, 4460, 9706, 285, 840, 921, 326, 18870, 3024, 310, 247, 18052, 2715, 436, 310, 4722, 285, 2722, 247, 2257, 273, 5373, 50276, 6050, 253, 2929, 16633, 327, 3676, 11454, 7274, 352, 651, 320, 1175, 281, 26542, 690, 625, 7274, 326, 403, 417, 11454, 352, 310, 4619, 281, 6780, 690, 273, 253, 13260, 824, 347, 253, 3700, 4758, 323, 4227, 5899, 43124, 7274, 281, 16161, 391, 
6535, 793, 513, 417, 878, 436, 667, 5430, 436, 310, 1663, 247, 5884, 2523, 50275, 783, 2022, 42719, 310, 253, 9759, 253, 2929, 310, 1077, 1199, 3559, 275, 247, 18870, 3024, 37382, 1039, 275, 1614, 436, 9563, 1077, 1199, 751, 271, 11369, 5763, 436, 2299, 310, 417, 253, 1083, 275, 619, 4743, 923, 2805, 20, 50275, 783, 5661, 1543, 1329, 253, 12510, 273, 546, 20227, 292, 2167, 253, 28913, 1263, 327, 9168, 1979, 943, 2581, 320, 247, 13642, 3368, 50275, 2004, 253, 4679, 921, 752, 597, 403, 6326, 281, 921, 891, 717, 1335, 12371, 1880, 581, 812, 2686, 1246, 1543, 835, 18870, 3024, 651, 9093, 2740, 1066, 50276, 783, 2929, 29328, 8069, 10885, 1327, 8501, 592, 285, 5231, 2990, 546, 20227, 292, 323, 16161, 38524, 31934, 793, 391, 6535, 793, 546, 20227, 292, 21168, 2220, 18870, 3024, 3652, 247, 12006, 254, 4216, 285, 970, 247, 38539, 29810, 253, 5661, 1543, 7568, 11701, 50275, 6050, 253, 2929, 16633, 327, 3676, 11454, 7274, 352, 651, 320, 1175, 281, 26542, 690, 625, 7274, 326, 403, 417, 11454, 436, 310, 4619, 281, 6780, 690, 273, 253, 9376, 824, 347, 253, 3700, 4758, 323, 4227, 5899, 43124, 7274, 281, 16161, 391, 6535, 793, 513, 417, 878, 436, 667, 5430, 436, 310, 1663, 247, 5884, 3374, 50275, 783, 2022, 42719, 310, 253, 9759, 253, 2929, 310, 1077, 1199, 3559, 275, 247, 18870, 3024, 37382, 1039, 275, 1614, 436, 9563, 1670, 90, 1199, 751, 271, 11369, 18418, 436, 2299, 310, 417, 253, 1083, 275, 619, 4743, 253, 5161, 12288, 310, 253, 625, 3505, 74, 6216, 10885, 273, 1327, 8501, 592, 436, 943, 320, 253, 4055, 323, 253, 2929, 275, 643, 3000, 253, 2929, 943, 1246, 253, 4460, 9706, 285, 840, 921, 326, 18870, 3024, 310, 247, 18052, 2715, 840, 671, 253, 3910, 275, 253, 4715, 2746, 943, 320, 16318, 253, 5661, 1543, 1329, 253, 12510, 273, 546, 20227, 292, 2167, 253, 28913, 1263, 327, 425, 74, 19180, 7390, 3639, 1979, 943, 2581, 320, 247, 13642, 4679, 2167, 253, 4679, 921, 752, 597, 403, 9428, 281, 921, 891, 717, 1335, 12371, 1880, 581, 812, 2686, 1246, 1543, 835, 18870, 3024, 651, 9093, 2740, 1066, 50275, 936, 26799, 2969, 533, 1077, 3576, 6880, 273, 18870, 3024, 323, 16161, 391, 6535, 793, 50276, 6050, 253, 2929, 812, 452, 644, 3559, 275, 10046, 1039, 253, 9021, 403, 4722, 50276, 187, 187, 4118, 18435, 27, 13518, 2278, 253, 30628, 5821, 326, 436, 2929, 2789, 247, 9865, 7680, 627, 403, 1142, 13991, 275, 253, 10123, 285, 634, 2380, 326, 943, 320, 2668, 715, 2395, 672, 3585, 2182, 436, 2929 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper looks good enough to be accepted that said the clarity should be improved please reword and be more specific when presenting definitions if we do not only observe the rewards but also some additional variables and the rewards and these variables are generated according to a causal model then the rewards are no longer independent and this is called a causal bandit problem be more specific about which previous works you are referring to in this case the previously cited works on this topic have shown that one can use this additional structure to improve our performance how is the context variable different from the regime indicator variable ft used by philip dawid https://arxiv.org/pdf/2004.12493 see section 2 note that while this reference is relatively new the use of such variables can be found in older papers by dawid and in pearls causality book in the introduction the authors state we develop a framework for causal bandits where everything is learned from scratch this is an exaggeration later on in section 31 they state we introduce a novel information sharing estimator that relies on much less specific knowledge about the causal graph please state explicitly the information being assumed about the causal graph typos and spelling mistakes then much less restrictive then specifically focussing on the algorithm is presented in the appendix it should probably be moved to the main paper docsepin this paper the authors proposed a new bandit algorithm that can more efficiently estimate the causal effect via learning the causal graphs the simulation results on the sachs data show that there is improved cumulative regret compared to classical bandit approaches i took a brief look at the proof and did not find any problem overall i think it is an ok paper and may be of interest to the clear audience nevertheless after carefully reviewing this manuscript i still have the following questions 1 theorem 2 only shows that there is variance reduction given a valid separating set however in practice all the separating sets are learned by the algorithm and the causal discovery algorithm may not be guaranteed to find a valid set for separation is there any theoretical analysis to leverage the estimation errors from causal discovery algorithms into the regret or variance bound 2 in section 52 the authors showed an improved cumulative regret bound on the sachs data set this means that when the causal graph is easy to learn the proposed method can achieve a relatively nicer performance compared to orthodox bandit algorithms i wonder how the performance would be when we instead apply it on a data set with a more complicated causal graph for example when it is applied to a data set with a dense causal graph or with a causal graph containing latent confounders would it still perform better than the ordinary bandit methods or would it be the other way around overall i think it is an ok paper but considering the problems with the theoretical analysis and real world applications mentioned above the contribution seems rather marginaldocsepi have reviewed several causal bandit papers and found that some of the papers contribution is simply using the parents of reward variable y to improve the performance in my review i wrote that there may be other separating sets which can yield better variance reduction hence i somehow expected this idea to be realized the authors solution is simple and clean the use of context
variable regime indicator is also a nice addition some questions why do the algorithms in the appendix not check a separating set on the fly do we have to perform causal discovery every iteration or once in a while depending on the pvalue one may skip a few iterations do we have an online version of a conditional independence test or causal discovery algorithm that can be utilized in this setting the result with direct independence testing is better than those with the causal discovery algorithm would it be desirable to employ some kind of optimism in the face of uncertainty for a separating set ### Summary:
this paper proposes a causal bandit algorithm that does not assume prior knowledge about the causal structure and efficiently estimates the causal effect using a separating set between intervention and reward among different arms the simulation results show that the proposed algorithm outperforms causal bandit approaches that do not leverage causal information obtainable through data and conditional independence tests the reviewers agree that the paper proposes an attractive solution to the causal bandit problem however the clarity and exposition of the paper can be improved therefore we recommend the authors take the reviewers comments to heart and encourage them to incorporate their thoughts in preparing the final version of their manuscript
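the reviews and summary above revolve around sharing samples across arms through a separating set to reduce variance. the sketch below illustrates the general idea of such a plug-in estimator for a discrete separating set z; it is not the paper's actual information sharing estimator, and the function name, data layout, and discreteness assumption are choices made here for illustration only.

```python
import numpy as np

def separating_set_estimate(arms, z_vals, rewards, target_arm):
    """Sketch: estimate E[Y | do(A = target_arm)] as sum_z P(z | A=target_arm) * E[Y | Z=z],
    where E[Y | Z=z] is pooled over pulls of *all* arms. This is only valid if Z is a
    separating set, i.e. the reward is independent of the chosen arm given Z."""
    arms, z_vals, rewards = map(np.asarray, (arms, z_vals, rewards))
    own_pulls = arms == target_arm
    if not np.any(own_pulls):
        raise ValueError("no pulls of the target arm yet")
    estimate = 0.0
    for z in np.unique(z_vals):
        p_z_given_arm = np.mean(z_vals[own_pulls] == z)    # P(z | target arm), from its own pulls
        if p_z_given_arm == 0.0:
            continue
        mean_reward_given_z = rewards[z_vals == z].mean()  # E[Y | z], shared across every arm
        estimate += p_z_given_arm * mean_reward_given_z
    return estimate
```

because the conditional means are estimated from all arms' data rather than from the target arm alone, each arm's estimate uses far more samples than a per-arm average, which is the source of the variance reduction the reviewers discuss.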
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 4453, 1175, 2217, 281, 320, 7607, 326, 753, 253, 19843, 943, 320, 5520, 50275, 32897, 294, 3418, 285, 320, 625, 2173, 672, 15250, 14308, 604, 359, 513, 417, 760, 10018, 253, 23267, 533, 671, 690, 3081, 4903, 285, 253, 23267, 285, 841, 4903, 403, 4561, 2556, 281, 247, 19349, 1566, 840, 253, 23267, 403, 642, 3356, 3907, 285, 436, 310, 1925, 247, 19349, 3961, 262, 1895, 50276, 1257, 625, 2173, 670, 534, 2045, 2987, 368, 403, 14339, 281, 275, 436, 1083, 253, 3786, 11106, 2987, 327, 436, 9400, 452, 2011, 326, 581, 476, 436, 3081, 2605, 281, 3157, 776, 3045, 50276, 5430, 310, 253, 3634, 4778, 1027, 432, 253, 9459, 15301, 4778, 23899, 908, 407, 30005, 532, 277, 1403, 301, 5987, 39962, 2061, 9275, 9430, 805, 35337, 923, 2593, 374, 3877, 326, 1223, 436, 3806, 310, 4942, 747, 253, 897, 273, 824, 4903, 476, 320, 1119, 275, 5662, 9380, 407, 277, 1403, 301, 285, 275, 27887, 5200, 46449, 1984, 50276, 249, 253, 10199, 253, 4477, 1375, 359, 1287, 247, 7792, 323, 19349, 3961, 953, 835, 3253, 310, 6311, 432, 20041, 436, 310, 271, 23668, 318, 50276, 31312, 327, 275, 2593, 4562, 597, 1375, 359, 9569, 247, 4460, 1491, 9628, 29107, 326, 15771, 327, 1199, 1679, 2173, 3640, 670, 253, 19349, 4216, 4496, 1375, 11120, 253, 1491, 1146, 8025, 670, 253, 19349, 4216, 50276, 555, 993, 285, 33797, 16503, 840, 1199, 1679, 29190, 840, 5742, 41685, 1316, 272, 327, 50275, 783, 5933, 310, 3559, 275, 253, 30762, 352, 943, 3164, 320, 4395, 281, 253, 2022, 2929, 50276, 7152, 339, 9852, 436, 2929, 253, 4477, 4081, 247, 747, 3961, 262, 5933, 326, 476, 625, 14556, 6642, 253, 19349, 1055, 3066, 4715, 253, 19349, 14580, 253, 9864, 1543, 327, 253, 256, 607, 84, 941, 921, 326, 627, 310, 5520, 18849, 14938, 2429, 281, 8946, 3961, 262, 7274, 891, 2335, 247, 4864, 1007, 387, 253, 4737, 285, 858, 417, 1089, 667, 1895, 4583, 891, 1158, 352, 310, 271, 8718, 2929, 285, 778, 320, 273, 1600, 281, 253, 2590, 8446, 17837, 846, 9257, 9814, 436, 7714, 891, 1335, 452, 253, 1563, 3533, 50276, 18, 10012, 374, 760, 2722, 326, 627, 310, 11041, 5141, 407, 1677, 247, 3588, 23694, 873, 2299, 275, 3946, 512, 253, 23694, 5239, 403, 6311, 432, 253, 5933, 285, 253, 19349, 8900, 5933, 778, 417, 320, 16293, 281, 1089, 253, 3588, 873, 323, 9712, 310, 627, 667, 10527, 1783, 281, 25057, 253, 13418, 6332, 432, 19349, 8900, 11333, 715, 253, 14938, 390, 11041, 3033, 50276, 19, 275, 2593, 8073, 253, 4477, 2692, 5520, 18849, 14938, 3033, 327, 256, 607, 84, 941, 873, 436, 2097, 326, 672, 253, 19349, 4216, 310, 3477, 281, 3037, 253, 4081, 1332, 476, 5115, 247, 4942, 49482, 3045, 2429, 281, 9373, 19966, 3961, 262, 11333, 891, 4282, 849, 253, 3045, 651, 320, 672, 359, 3185, 4647, 352, 327, 247, 941, 873, 342, 625, 9542, 19349, 4216, 323, 1650, 672, 352, 310, 3732, 281, 247, 941, 342, 247, 14086, 19349, 4216, 390, 342, 247, 19349, 4216, 4508, 21624, 44667, 398, 651, 352, 1335, 1347, 973, 685, 253, 9826, 3961, 262, 3082, 390, 352, 651, 320, 253, 643, 1039, 1475, 50276, 1189, 455, 891, 1158, 352, 310, 271, 8718, 2929, 533, 7296, 253, 3237, 327, 10527, 1783, 285, 1524, 1533, 4893, 5393, 1840, 253, 7680, 3133, 2581, 16888, 7152, 339, 2059, 452, 9814, 2067, 19349, 3961, 262, 9380, 285, 1119, 326, 690, 273, 253, 9380, 7680, 310, 3365, 970, 253, 4651, 273, 10921, 4778, 340, 281, 3157, 253, 3045, 275, 619, 2278, 891, 4159, 326, 627, 778, 320, 643, 23694, 873, 534, 476, 4917, 1805, 11041, 5141, 7613, 891, 369, 10380, 3264, 436, 
2934, 1146, 8156, 50276, 783, 4477, 2900, 310, 2969, 285, 4076, 253, 897, 273, 3634, 4778, 50276, 1747, 553, 15301, 310, 671, 247, 5322, 1635, 50275, 8826, 3533, 2139, 1057, 253, 11333, 275, 253, 30762, 1057, 417, 2451, 247, 23694, 873, 327, 253, 8778, 50276, 3088, 359, 452, 281, 1347, 19349, 8900, 1046, 19502, 390, 2378, 275, 247, 1223, 7293, 327, 268, 2877, 581, 778, 17049, 247, 1643, 19502, 513, 359, 452, 271, 3909, 2715, 273, 17697, 14275, 1071, 390, 19349, 8900, 5933, 326, 476, 320, 12845, 275, 436, 4758, 50276, 783, 906, 342, 1480, 14275, 5175, 310, 1805, 685, 1110, 342, 19349, 8900, 5933, 651, 352, 320, 11408, 281, 2126, 690, 2238, 273, 36970, 275, 253, 2454, 273, 11649, 323, 247, 23694, 873, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 19349, 3961, 262, 5933, 326, 1057, 417, 5467, 2720, 3640, 670, 253, 19349, 2605, 285, 14556, 8197, 253, 19349, 1055, 970, 247, 23694, 873, 875, 7268, 285, 10921, 2190, 1027, 6174, 253, 9864, 1543, 921, 326, 253, 4081, 5933, 41731, 13015, 19349, 3961, 262, 7274, 326, 513, 417, 25057, 19349, 1491, 4044, 494, 949, 941, 285, 17697, 14275, 5216, 50276, 783, 30628, 5194, 326, 253, 2929, 29328, 271, 12994, 2900, 281, 253, 19349, 3961, 262, 1895, 2299, 253, 19843, 285, 47284, 273, 253, 2929, 476, 320, 5520, 3103, 359, 5583, 253, 4477, 1379, 253, 30628, 5701, 281, 2798, 285, 11907, 731, 281, 19071, 616, 7906, 275, 13828, 253, 2457, 2715, 273, 616, 7714 ]
Below is given a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:
the paper uses monotonic neural networks to estimate the energy consumption of chiller plants this can be used to save energy by using the neural network in modelbased optimization

on the positive side the paper actually tests a monotonic neural network in mbo on a real chiller plant dropping the pue by about 2 percent eg from 1.54 to 1.52

on the negative side all of the fundamental ideas in the submission are quite old the idea of penalizing nonmonotonic behavior in neural networks was first published in 1 the idea of forcing a monotonic structure on a neural network was first published in 2 the submission references 3 which enforces convexity not monotonicity the idea of using a neural network to predict the energy consumption of a chiller plant was published in 4 although other work may predate that the submission does not exactly reproduce these older ideas eg the monotonicity penalty in 1 is quadratic while in the submission it is either linear or a log loss mapped through a sigmoid however no comparison between these ideas was tested so it is unclear whether the difference from the old work is significant

this paper would be of interest to a small segment of engineers who are interested in optimizing chiller plants it could be made of wider interest if either a some stronger mathematical statements could be made or b more than one empirical experiment were carried out perhaps across different types of systems perhaps the authors could try a system identification benchmark eg 5

references
1 sill joseph and yaser s abu-mostafa monotonicity hints in advances in neural information processing systems pp 634-640 1997
2 sill joseph monotonic networks in advances in neural information processing systems pp 661-667 1998
3 amos brandon lei xu and j zico kolter input convex neural networks in international conference on machine learning pp 146-155 2017
4 gao jim machine learning applications for data center optimization 2014
5 nonlinearbenchmark.org

docsep
summary
this paper tackles the energy optimization problem of chiller plants which are the main equipment of a cooling system this paper aims to learn the nonlinear mapping between the control parameters of a cooling system and energy consumption the key idea of this paper is to incorporate domain knowledge into the training of neural networks specifically monotonic constraints are proposed to constrain the learning of the nonlinear mapping so that physical laws are satisfied the authors propose two methods to implement monotonicity hardmnn and softmnn hardmnn treats monotonicity as hard constraints and uses a monotonic neural network to learn the nonlinear mapping softmnn treats monotonicity as soft constraints and adds a monotonic loss into the training objective to encourage the learned mapping to be monotonic experiments show that 1 both hardmnn and softmnn outperform a multilayer perceptron mlp and 2 hardmnn yields lower energy consumption than mlp

pros
the idea of incorporating domain knowledge into training neural nets is reasonable but this is not a new idea eg kggan knowledge-guided generative adversarial networks
appendix a1 does a great job of giving an introduction about cooling systems and chiller plants

cons
some claims are unjustified
section 2 states that we make a theoretical analysis and ... however i cannot find such an analysis does the theoretical analysis mean equations 3 and 4 and table 1
section 2 states that compared with the above
state-of-the-art method mnn reduces the dependence on the amount of data provides a more accurate function space facilitates subsequent optimization steps and improves optimization performance however the current experimental results are not strong enough to support these four claims specifically no experiments empirically compare performance in terms of different amounts of training data there is no attempt to quantify the socalled more accurate function space and no results show information about the optimization applied to the learned mapping function

the technical method is not clearly described
appendix a3 gives an overview of how the energy optimization problem is decoupled into two subproblems identification and optimization i appreciate the overview but also think there is room for improvement this part could be important since it gives a concrete highlevel view of the problem formulation in particular this paper concentrates on the first subproblem however this point is not evident either from the introduction or from the beginning of the method section making it hard for me to understand the problem formulation clearly in my firstpass reading
in section 4 three types of models are introduced however their names and notations are not linked together which makes the reading very difficult it would be better if the notations were shown together with their corresponding names like the cooling/chilled water pump power models pcowp and pchwp the cooling tower power model pct and the chiller power model pch
the connections between math equations are loose which makes the technical method hard to understand for example 1 the connection between y in equation 2 and ych in equation 4 is not clear 2 the connection between ych in equation 4 and xch in equation 5 is also not clear enough 3 cfchillerx mentioned in section 4.3 is somewhat isolated and its relationship with the other equations is not evident
the last five lines of section 4.3 below figure 4.4 are hard to understand overall the technical writing of section 4 could be further improved
figures 4.1 and 4.4 lack a detailed explanation and thus are hard to understand what do ffan cowpump p3 p1p2p3 in the figure mean respectively what is the difference between a and b why is a called the identification hyperplane while b is called the optimization hyperplane
most of the math notations are not explicitly explained in the paragraphs i had difficulty reading the mathematical equations until i found the table of notations in the appendix table 2 however table 2 is not referenced throughout the paper

insufficient experiments in terms of setting details and comparisons
the experiments section lacks a detailed description of dataset information training settings and implementation details the current experiments look impossible to reproduce the results what is the general introduction of the dataset used in the experiments do you split the data into training validation and testing sets what is the number of data examples of each subset how to train the proposed model what is the detailed network architecture of the mlp and the proposed mnn
in the first experiment figure 5.1 what is mape and how to calculate it what is dc mentioned in figure 5.1
the second experiment figure 5.2 is unclear in its purpose comparison method metric and effectiveness it is hard to judge the effectiveness of the proposed method based on the experiments why do we need to consider experimenting with different wet-bulb temperatures what is the key difference in terms of the experiment setting compared with the
previous experiment the 1st experiment focuses on internal control while the 2nd experiment focuses on external control can the authors elaborate more on these two experiments
figures 5.1 and 5.2 what is pue and how to calculate it what do the dots and lines in figure 5.2 mean respectively what is mlp with local pid pid is not explained throughout the paper is improving the average pue by 1.5 significant why are there no results for softmnn in this experiment
as the related work mentioned linear regression is the most commonly used modeling method in the real world a baseline comparison with linear regression looks essential to let readers know the baseline performance however the current manuscript compares with mlp only is there any reason that prevents comparing with linear regression
this paper proposes a new activation function equation 11 and two extensions of the proposed monotonic neural networks mnn partialmnn and recurrentmnn however no results are shown to demonstrate their effectiveness

docsep
this paper presents how to utilize domain knowledge which might be useful for solving problems in developing a deep neural network in the form of variable constraints or objective functions and the proposed method called monotonic neural network was applied to the application of chiller plant optimization compared to the mlp method the proposed method showed better performance in the accuracy measure

though this article shows how a machine learning model can be guided with prior knowledge in a realworld application it is hard to say that this paper is above the accept line as there is a lot of room for improvement

completeness no conclusion
to make the research more consolidated authors need to consider putting conclusions of their work including a summary reemphasizing what they found some key points or future works by just finishing with experiments in this article readers may wonder what the main points of the authors arguments are

explanations for abbreviations
as the paper deals with the application of chiller plants using technical terms is inevitable eg pue wet-bulb temperature dc or chwp however authors need to bring the meaning of the terms from the appendix into the main article so that readers can glimpse them easily when the unknown terms are shown

applicability
its applicability is limited only to the same problem from the same facilities authors may consider making their proposed method more general so that practitioners from other industries who want to combine their domain knowledge into their problem can get ways or insights from this study
no specific information about the proposed model structure is found how many layers k are used and why what is the size of the layer weights how are hyperparameters set and from how many candidates

superiority
authors may compare their work to other stateoftheart approaches to present their superiority results showing better than mlp may not appeal to readers

typo correction
authors need to check the correctness of sentences especially the first paragraph of section 2 has several misuses of capital letters and the fifth line starting with although some is not a complete sentence in section 3 consider should be considering in section 2 third paragraph state of art should be state of the art

docsep
many thanks to the authors for their submission which aims at combining datadriven approaches with modelbased ones for optimising centralised cooling systems chiller plants the overarching aim of this paper has been well studied before given one aims at incorporating physical constraints one needs a solid
understanding of the underlying physical system hence how one adds such a constraint to the model requires deep investigation the machine learning part of this paper is also not something new to match the physical constraints of the problem at hand the authors incorporate monotonicity into the neural network model other than that the paper does not offer a comprehensive explanation of what the novelty and actual contributions are nor does it provide a solid experimental setup please consider some of my comments below

major comments
no conclusion or discussion has been provided which makes the paper incomplete
the experimental section is rather inadequate and rushed the results are not explained properly
no concrete information on the dataset used has been provided to contextualise the experimental design
the paper is heavily skewed towards providing too much background information and rather simplistic information on incorporating physical constraints onto the loss function of a neural net

minor comments
quite a few punctuation errors exist some sentences are not finished or finish with a full stop where they shouldnt please consider proofreading the manuscript carefully for correcting such typos

### Summary:
thank you for your submission to iclr the reviewers unanimously felt that there were substantial issues with this work owing to the fact that both the techniques and applications have been considered in a great deal of previous work furthermore the manuscript itself needs substantial amounts of revision before being suitable for publication as there was no response to these points during the rebuttal period it seems clear that the paper cant be accepted in its current form
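The reviews above contrast two ways of building monotonicity into a network: a hard architectural constraint (hardmnn, in the spirit of Sill's monotonic networks) and a soft penalty added to the training objective (softmnn, in the spirit of monotonicity hints). The sketch below is a minimal, generic illustration of the soft variant only; it is not taken from the submission. The function name, the use of PyTorch autograd, the `mono_idx` argument, and the quadratic-versus-linear switch are all assumptions for illustration — the submission reportedly uses a linear or log-sigmoid penalty whose exact form is not given in the reviews.

```python
import torch


def monotonicity_penalty(model, x, mono_idx, kind="quadratic"):
    """Soft monotonicity hint: penalize negative partial derivatives of the
    model output with respect to the input features listed in mono_idx,
    i.e. the features domain knowledge says the output should increase with."""
    x = x.detach().clone().requires_grad_(True)
    y = model(x)                                   # shape (batch, 1) or (batch,)
    grads = torch.autograd.grad(y.sum(), x, create_graph=True)[0]
    violation = torch.relu(-grads[:, mono_idx])    # > 0 only where monotonicity is violated
    if kind == "quadratic":                        # hint-style penalty as in Sill & Abu-Mostafa
        return (violation ** 2).mean()
    return violation.mean()                        # linear variant

# training step (sketch): prediction loss plus the soft constraint
# loss = torch.nn.functional.mse_loss(model(x), target) \
#        + lam * monotonicity_penalty(model, x, mono_idx)
```

The hard variant needs no such term: constraining the relevant weights to be non-negative and using monotone activations makes the network monotonic by construction, which is the trade-off between guaranteed physical consistency and restricted function class that the reviews discuss.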
Below is given a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:
summary
a clear and interesting presentation on learning sequence distributions it achieves this objective by replacing the discriminator with a mediator a mixture between the training distribution and the target distribution which is estimated via maximum likelihood

pros
original idea for modelling the distribution of sequence data
theoretical convergence in the jensen-shannon divergence sense
promising experiments

cons
no major cons to the best of my knowledge

typos
it would be very nice to have black and white colorblind friendly graphs
eq 10 too long
introduce jm jg in a sentence
comma at the end of eq 5 and maybe align generator and discriminator in some position eg at the semicolon
missing dot at eq 8

questions
how would you ensure reproducibility eg link to some code
is there any hope to obtain consistency convergence wrt other metrics

docsep
the paper proposes an interesting method where the discriminator is replaced by a component that estimates the density that is the mixture of the data and the generators distributions in a sense that component is only a device that allows estimating a jensen-shannon divergence for the generator to then be optimized against other gan papers have replaced their discriminator by a similar device eg wgans but the present formulation seems novel the numerical experiments presented on a synthetic turing test and text generation from the emnlp 2017 news dataset appear promising

overall the mediator seems to allow achieving lower jensen-shannon js divergence values in the experiments and is kind of designed for that although this may be an improvement with respect to existing methods for discrete sequential data it may also be limited in that it may not easily extend to other types of divergences that have proved superior to js in some continuous settings

the paper is rather clear although there are lots of small grammatical errors as well as odd formulations which end up being distracting or confusing the language should be proofread carefully

pros
generative modeling of sequence data still in its infancy
potentially lower variance than policy gradient approaches
experiments are promising

cons
lots of grammatical errors and odd formulations

questions
equation 14 what does it mean to find the maximum entropy solution for the given optimization problem
figure 2 how do b and c relate to each other

remarks small typos and odd formulations
for measuring mphi what does measuring mean in this context what does small m refer to
algorithm 1 says the total number of steps but it is also used in the main text as an index for j and pi
for mediator equation block 8 jm has not been defined yet
the supports of distributions g and p g without subscript has not been defined in this context
if the training being perfect tend to get stuck in some suboptimals the learned distribution collapses since the data distribution is thus
that measures a that estimates a a predictive module a bit unclear generative v discriminative is more usual terminology
is well ensured with the cost of diversity at the cost of diversity
has theoretical guarantee
in the references alias parth goyal all caps
let p denote the intermediate states i dont understand what this is where is p used
proof of theorem 3 cot theoretically guarantees the training effectiveness what does that mean
figure 3 epochs epochs
algorithm 1 what does mixed balanced samples mean make this more precise
wideranged
equation 10 is too long and the equation number is not properly formatted
figures are hard to read in black and white
figure 2 doesnt use the same limits for the y axis of the two nll plots making comparisons difficult the two nll plots are also not side by side

docsep
pros
this paper is easy to follow the idea is nice in three respects 1 by changing the auxiliary models role from a discriminator to a mediator it directly optimizes the jsd measure which is a symmetrized and smoothed version of kl divergence 2 moreover the mediator and the generator follow similar predictive goals rather than the opposite goals of g and d in gans 3 for discrete sequential data it avoids approximating expected rewards using markov rollouts

cons
some details are missing in the experiments 1 in table 2 of reference a leakgan seqgan and rankgan all show significantly better performances in terms of bleu on emnlp2017 wmt compared to the results reported in table 3 of the submission any difference 2 the word mover distance is computed by training a discriminator which could be unstable could you provide other metrics to evaluate diversity like self-bleu

a guo jiaxian et al long text generation via adversarial training with leaked information arxiv preprint arxiv:1709.08624 2017

misc
1 how will the number of samples ie batch size affect cot 2 how is the applicability of cot for continuous data it seems to me there are no theoretical difficulties in applying cot to continuous data

### Summary:
the paper proposes an original and interesting alternative to gans for optimizing a proxy to the jensen-shannon divergence for discrete sequence data experimental results seem promising official reviewers were largely positive based on originality and results however as it currently stands the paper still makes false claims that are not well explained or supported in particular its repeated central claim to provide a lowvariance biasfree algorithm to optimize js

given that these central issues were clearly pointed out in a review from a prior submission of this work to another venue review reposted on the current openreview thread on nov 6 the ac feels that the authors had had plenty of time to look into them and address them in the paper as well as occasions to reference and discuss relevant related work pointed out in that review the current version of the paper does neither

the algorithm is not unbiased for at least two reasons pointed out in discussions a in practice a parameterized mediator will be unable to match the true pg at best yielding a useful biased estimate not unlike how gans parameterized discriminator induces bias b one would need to use reinforce or similar to get an unbiased estimate of the gradient in eq 13 a key detail omitted from the paper

from the discussion thread it is possible that authors were initially confused about the fact that this fundamental issue did not disappear with eq 13 they commented most important idea we want to present in this paper is how to avoid incorporating reinforce please refer to eq 13 which is the key to the success of this but rather as guessed by a commentator a heuristic implementation not explained in the paper dropped the reinforce term thus effectively trading variance for bias

on december 4th authors posted a justification confirming heuristically dropping the reinforce terms when taking the gradient of eq 13 and said they could attach detailed analysis and experiment results in the cameraready version however if one of the most important ideas of the paper is how to avoid reinforce as still implied and highlighted in the abstract the ac finds it worrisome that the paper had no explanation of when and how this was done and no analysis of the bias induced by unreportedly dropping the term

the approach remains original interesting and potentially promising but as it currently stands ac and sac agreed that inexact theoretical overclaiming and insufficient justification and indepth analysis of key heuristic shortcuts and tradeoffs however useful are too important for their fixing to be entrusted to a final cameraready revision step a major revision that clearly addresses these issues in depth both in how the approach is presented and in supporting experiments will constitute a much more convincing sound and impactful research contribution
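As background for the bias discussion above: the identity the mediator construction relies on is the standard decomposition of the Jensen-Shannon divergence through the mixture density. The notation below is generic (p for the data distribution, g for the generator, m for the mixture the mediator is trained to approximate) rather than the paper's own, and it is a textbook fact, not a claim about the submission. How faithfully a parameterized mediator matches m, and how the expectation under g is differentiated through discrete samples, is exactly where the bias and variance concerns raised in the meta-review enter.

```latex
m = \tfrac{1}{2}\,(p + g), \qquad
\mathrm{JSD}(p \,\|\, g)
  = \tfrac{1}{2}\,\mathrm{KL}(p \,\|\, m) + \tfrac{1}{2}\,\mathrm{KL}(g \,\|\, m)
  = \tfrac{1}{2}\,\mathbb{E}_{x \sim p}\!\left[\log \tfrac{p(x)}{m(x)}\right]
  + \tfrac{1}{2}\,\mathbb{E}_{x \sim g}\!\left[\log \tfrac{g(x)}{m(x)}\right]
```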
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 8774, 247, 2590, 271, 734, 1120, 272, 9759, 327, 4715, 6430, 10670, 352, 5115, 436, 8103, 407, 15706, 253, 7134, 12915, 342, 247, 37190, 247, 7802, 875, 253, 3733, 3268, 285, 253, 2303, 3268, 534, 310, 5998, 3066, 4869, 12177, 50276, 856, 84, 50276, 19164, 2934, 323, 26278, 3268, 273, 3425, 941, 50276, 783, 33977, 14940, 275, 253, 480, 19434, 439, 46339, 23279, 3282, 50276, 13382, 2182, 4679, 50276, 5040, 50276, 2369, 2201, 772, 281, 253, 1682, 273, 619, 3640, 50276, 555, 993, 50276, 262, 651, 320, 1077, 5322, 281, 452, 2806, 285, 3168, 50276, 4897, 9645, 11453, 14580, 50276, 2574, 884, 1512, 1048, 50276, 36445, 336, 480, 78, 50276, 75, 72, 275, 50276, 36817, 50276, 18187, 387, 253, 990, 273, 16186, 608, 285, 5046, 8495, 14156, 285, 7134, 12915, 275, 690, 1899, 24088, 387, 253, 10020, 6769, 50276, 33722, 14261, 387, 16186, 854, 50276, 19751, 50276, 5430, 651, 368, 5416, 38041, 24088, 3048, 281, 690, 2127, 50276, 261, 627, 667, 3524, 281, 4044, 15274, 14940, 8772, 643, 17082, 7152, 339, 431, 248, 2929, 29328, 271, 4722, 1332, 835, 253, 7134, 12915, 310, 7932, 407, 247, 4445, 326, 8197, 253, 4038, 326, 310, 253, 7802, 273, 253, 941, 285, 253, 21025, 10670, 275, 247, 3282, 326, 4445, 310, 760, 247, 2813, 326, 4483, 26230, 247, 480, 561, 561, 73, 16554, 23279, 323, 253, 14156, 281, 840, 320, 18325, 1411, 643, 36827, 9380, 452, 7932, 616, 7134, 12915, 407, 247, 2074, 2813, 24088, 259, 72, 507, 50276, 2858, 253, 1246, 15895, 3133, 4460, 253, 10704, 4679, 3559, 327, 247, 13506, 246, 981, 1071, 285, 2505, 5978, 432, 802, 13307, 793, 4240, 3668, 10895, 3176, 12532, 50275, 1189, 455, 253, 37190, 3133, 281, 1581, 281, 5115, 2406, 480, 561, 561, 73, 16554, 23421, 23279, 2193, 275, 253, 4679, 285, 310, 2238, 273, 4158, 323, 326, 3738, 436, 778, 320, 271, 7756, 342, 1675, 281, 5368, 3082, 323, 13358, 22453, 941, 352, 778, 671, 320, 3710, 275, 326, 352, 778, 417, 4354, 9017, 281, 643, 3510, 273, 11711, 1541, 707, 326, 452, 8058, 8936, 281, 23421, 275, 690, 5415, 7533, 50276, 783, 2929, 310, 2581, 2590, 3738, 627, 403, 8783, 273, 1355, 47412, 474, 6332, 347, 973, 347, 8909, 26850, 534, 990, 598, 1146, 940, 25031, 390, 21643, 253, 3448, 943, 320, 4737, 1088, 9257, 50275, 856, 84, 50276, 36749, 14053, 273, 3425, 941, 1335, 275, 697, 42986, 50276, 11714, 4303, 2406, 11041, 685, 3646, 11786, 7274, 50276, 16217, 3825, 403, 12532, 50276, 5040, 50276, 77, 1502, 273, 47412, 474, 6332, 285, 8909, 26850, 50276, 34974, 50276, 29813, 1638, 752, 1057, 352, 1599, 281, 1089, 253, 4869, 15579, 2900, 323, 253, 1677, 13757, 1895, 50276, 13206, 374, 849, 513, 270, 285, 260, 14588, 281, 1016, 643, 50276, 2013, 7969, 1355, 963, 993, 285, 8909, 26850, 50276, 1542, 10499, 278, 2162, 752, 1057, 10499, 1599, 275, 436, 3634, 50276, 5371, 1057, 1355, 278, 3730, 281, 5933, 337, 2296, 253, 2264, 1180, 273, 5018, 50276, 2858, 352, 310, 671, 908, 275, 253, 2022, 2505, 347, 271, 3605, 323, 480, 285, 12580, 323, 37190, 50276, 29813, 2972, 854, 480, 78, 556, 417, 644, 2931, 2568, 50276, 783, 8525, 273, 10670, 305, 285, 268, 50276, 72, 1293, 749, 3866, 556, 1024, 644, 2931, 275, 436, 3634, 50276, 338, 253, 3733, 1146, 3962, 50276, 85, 423, 281, 755, 10960, 275, 690, 749, 2178, 21185, 50276, 783, 6311, 3268, 3007, 23508, 50276, 17480, 50276, 783, 941, 3268, 310, 3021, 50275, 3529, 5593, 247, 50276, 3529, 8197, 247, 50275, 66, 15970, 6333, 247, 2372, 12744, 50276, 36749, 362, 20741, 800, 310, 625, 
7312, 28939, 50276, 261, 973, 33075, 50276, 3113, 253, 2105, 273, 9991, 50276, 255, 253, 2105, 273, 9991, 50276, 7110, 10527, 12215, 50276, 249, 253, 10414, 28129, 1061, 394, 564, 90, 267, 512, 12839, 50276, 1059, 268, 9173, 253, 10444, 3054, 891, 13414, 2096, 752, 436, 310, 835, 310, 268, 908, 4737, 273, 10012, 495, 50276, 27678, 28055, 23632, 253, 3733, 12510, 752, 1057, 326, 1599, 50276, 13206, 495, 44540, 50276, 554, 3770, 84, 50276, 41528, 337, 752, 1057, 6804, 16645, 3530, 1599, 1056, 436, 625, 10799, 50276, 88, 1334, 4626, 50276, 29813, 884, 310, 1512, 1048, 285, 5150, 1180, 310, 417, 6283, 39113, 50276, 40203, 1892, 281, 1239, 275, 2806, 50276, 11300, 50276, 13206, 374, 36908, 897, 253, 1072, 7787, 323, 253, 340, 7844, 273, 253, 767, 295, 620, 14777, 2403, 14023, 2834, 253, 767, 295, 620, 14777, 403, 671, 417, 1930, 44678, 1356, 406, 339, 377, 2921, 436, 2929, 310, 3477, 281, 956, 253, 2934, 310, 5322, 275, 1264, 34579, 50276, 18, 407, 6890, 253, 24026, 3210, 2554, 432, 247, 7134, 12915, 281, 247, 37190, 352, 3587, 5556, 4219, 253, 480, 8289, 2557, 534, 310, 247, 22892, 50065, 285, 43966, 2715, 273, 27451, 23279, 50275, 19, 25761, 253, 37190, 285, 253, 14156, 956, 2074, 15970, 7342, 2581, 685, 253, 7285, 50276, 2184, 932, 273, 305, 285, 277, 275, 305, 507, 50276, 20, 323, 13358, 22453, 941, 352, 32547, 4020, 839, 3264, 23267, 970, 1616, 729, 4533, 8349, 50273, 5040, 690, 4278, 403, 5816, 275, 253, 4679, 50276, 18, 275, 2829, 374, 273, 247, 13584, 1247, 22510, 1247, 285, 5958, 1247, 512, 921, 3012, 1805, 16226, 275, 2426, 273, 7387, 86, 327, 802, 13307, 81, 7132, 259, 6917, 2429, 281, 1543, 2361, 275, 2829, 495, 273, 253, 19529, 667, 3064, 374, 253, 3159, 278, 1189, 4181, 310, 10302, 407, 3733, 247, 7134, 12915, 534, 812, 320, 17631, 812, 368, 2085, 643, 17082, 281, 7472, 11711, 39660, 751, 1881, 934, 86, 50276, 66, 1149, 80, 480, 571, 89, 757, 1162, 355, 1048, 2505, 5978, 3066, 48960, 3733, 342, 31347, 1491, 549, 32693, 638, 3845, 549, 32693, 15046, 2270, 2691, 1348, 4240, 50276, 43671, 337, 849, 588, 253, 1180, 273, 3530, 26332, 14604, 1979, 2818, 13450, 50276, 19, 849, 310, 253, 30437, 273, 13450, 323, 5415, 941, 352, 3133, 281, 479, 627, 310, 642, 10527, 12748, 281, 4647, 13450, 327, 5415, 941, 187, 187, 4118, 18435, 27, 783, 2929, 29328, 271, 3236, 285, 4722, 5795, 281, 305, 507, 323, 39793, 247, 17335, 281, 480, 561, 561, 73, 16554, 23279, 323, 13358, 3425, 941, 5661, 1543, 1646, 12532, 3565, 30628, 497, 8127, 2762, 1754, 327, 3236, 414, 285, 1543, 2299, 347, 352, 4390, 9572, 253, 2929, 1335, 2789, 3221, 3916, 326, 403, 417, 973, 5544, 390, 4516, 275, 1798, 697, 6015, 4275, 1750, 281, 2085, 247, 1698, 87, 14417, 8492, 4924, 5933, 281, 22318, 23421, 50276, 28821, 326, 841, 4275, 3374, 497, 4518, 8042, 562, 275, 247, 2278, 432, 247, 2720, 19529, 273, 436, 789, 281, 1529, 18767, 2278, 1234, 493, 264, 327, 253, 1655, 1527, 15337, 6293, 327, 22458, 721, 253, 913, 9193, 326, 253, 4477, 574, 574, 9828, 273, 673, 281, 1007, 715, 731, 285, 2953, 731, 275, 253, 2929, 347, 973, 347, 15530, 281, 3806, 285, 2319, 4623, 2905, 789, 8042, 275, 326, 2278, 253, 1655, 2715, 273, 253, 2929, 1057, 6747, 253, 5933, 310, 417, 38663, 323, 387, 1878, 767, 4606, 8042, 562, 275, 11985, 247, 275, 3946, 247, 4764, 1025, 37190, 588, 320, 7591, 281, 3761, 253, 2032, 23256, 387, 1682, 27012, 247, 4217, 23539, 6642, 417, 12401, 849, 305, 507, 4764, 1025, 7134, 12915, 14757, 8492, 270, 581, 651, 878, 281, 897, 28432, 390, 2074, 281, 755, 271, 38663, 6642, 273, 253, 11786, 275, 16186, 2145, 247, 2234, 2508, 
11035, 432, 253, 2929, 432, 253, 5955, 6293, 352, 310, 1896, 326, 4477, 497, 8523, 13477, 670, 253, 958, 326, 436, 7936, 2523, 858, 417, 15529, 342, 16186, 2145, 597, 20503, 954, 1774, 2934, 359, 971, 281, 1246, 275, 436, 2929, 310, 849, 281, 3693, 24049, 28432, 4496, 3730, 281, 16186, 1012, 534, 310, 253, 2234, 281, 253, 2323, 273, 436, 533, 2581, 347, 30346, 407, 247, 49355, 326, 247, 47641, 7092, 417, 5544, 275, 253, 2929, 8231, 253, 28432, 1307, 3021, 8069, 11947, 11041, 323, 8492, 50276, 251, 372, 4246, 577, 394, 4477, 9269, 247, 22861, 24025, 344, 321, 18260, 18752, 253, 28432, 2426, 672, 3192, 253, 11786, 273, 16186, 2145, 285, 753, 597, 812, 16152, 7000, 1783, 285, 3368, 1543, 275, 253, 4049, 254, 609, 5102, 2715, 50276, 35529, 604, 581, 273, 253, 954, 1774, 2934, 273, 253, 2929, 310, 849, 281, 3693, 28432, 347, 1335, 10466, 285, 16318, 275, 253, 12002, 253, 913, 9010, 352, 548, 4448, 485, 326, 253, 2929, 574, 642, 8813, 273, 672, 285, 849, 436, 369, 2218, 285, 642, 1783, 273, 253, 8492, 5802, 407, 440, 20527, 314, 18752, 253, 1307, 50275, 783, 2746, 4558, 3236, 4722, 285, 7826, 12532, 533, 347, 352, 4390, 9572, 913, 285, 7044, 5821, 326, 29257, 514, 10527, 689, 43759, 285, 12497, 22861, 285, 801, 554, 394, 1783, 273, 2234, 47641, 28194, 1344, 796, 14273, 2299, 4217, 403, 1512, 1774, 323, 616, 18505, 281, 320, 49979, 281, 247, 2457, 4049, 254, 609, 5102, 18520, 3213, 247, 2201, 18520, 326, 4518, 519, 16443, 841, 3374, 275, 6864, 1097, 275, 849, 253, 2746, 310, 3559, 285, 275, 8109, 4679, 588, 12647, 247, 1199, 625, 21414, 3590, 285, 3486, 1020, 2561, 7680, 50276 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the authors provide finite-sample l_p rate guarantees for q-function estimation via minimax objectives. unlike prior work, they manage to prove guarantees with only realizability assumptions; prior works obtained such results only for policy evaluation but not q-function estimation, or required function spaces for the adversarial weights that are too expressive to leverage statistical power. i found the theory interesting and well presented; the results could be of interest to the community of offline rl. my main concerns are: why would one estimate a q-function other than in order to estimate an offline policy value? please provide more motivation, as otherwise the results are covered by prior work. why are you restricting to finite action and state spaces? given all your realizability assumptions it would seem that this is overkill and defeats the purpose; this kind of function approximation would be most powerful with continuous states and actions. it is very unfortunate that only slow rates are obtained, hence even for finite hypothesis spaces one gets n^{-1/4} rates. such realizability guarantees, and also the absence of ill-posedness in this inverse problem, should be leading to fast rates; see for instance the uncited work of chen and qi (https://arxiv.org/abs/2201.06169). please relate to this work in your response as it is highly relevant. why are you only giving bounds for finite hypothesis spaces? this is very restrictive given that this is mainly a theoretical contribution. the above points would need to be addressed before publication.

the paper studies the problem of learning the q-function and density ratio for a given policy in offline reinforcement learning settings where we have iid (state, action, reward, next state) tuples collected from the stationary distribution of some behavioral policy. they argue that in past work the focus has been on guarantees on the accuracy of policy value evaluation using estimates of these functions, rather than guarantees on the accuracy of the estimates of these functions themselves. the authors propose a new, more general framework for regularized estimators of these functions and provide guarantees on the l2 error of the resulting estimates under any given distribution of state-action values. in this analysis they provide some theoretical results which suggest that, using novel forms of regularization targeted towards some given distribution, they can likely achieve lower l2 error wrt that distribution. finally they test their theory with some simple synthetic experiments (edit: see follow-up discussion below for updates on these points). strengths: the theory is very clean and easy to read and understand. they provide a nice, more novel framework for q-function and state density ratio estimation with very strong theoretical guarantees for these tasks. their experiments are clean and well presented and make a reasonably strong case that their novel regularization is good at reducing l2 error under the target distribution of interest, despite the weaknesses described below. on the impact of this work, one positive impact is that it allows a strong relaxation of the identification conditions needed by existing minimax methods for q-function and state density ratio estimation used in the ope literature, which is significant. weaknesses: the problem of needing to obtain strong guarantees on the accuracy of q-function and state density ratio estimates is not very well motivated; as the authors
point out, the main way in which these functions are used in practice is for policy value estimation, which is already very well studied in past work with guarantees for the actual task of interest. the authors mention, for example, that learning these functions can be used as subroutines for other rl tasks such as actor-critic methods, but they do not provide much, if any, detail about this. related to and expanding on the previous point: since learning the q-function or state density ratio is not particularly useful in and of itself, it is disappointing that they do not show that improved performance in learning these functions can actually translate to improved performance in downstream tasks where they might be used. for example, they could provide either (1) theoretical results showing that realizable versions of their proposed estimators can give improved guarantees for downstream tasks, or (2) empirical results showing that their proposed estimators can improve the performance of downstream tasks in practice. without either (1) or (2) or something similar it is not clear that the work is actually high impact. similar to the above two points, it is also not very clear how useful it is to be able to guarantee low l2 error under particular target state-action distributions. for example, in the introduction they note that downstream learning algorithms that use off-policy function estimation as a subroutine often assume the estimation to be accurate under certain specific distributions, but unless these are fixed known distributions it is not clear how their theory can be utilized, since they can only provide error guarantees wrt fixed known distributions. again, without more analysis about how their theory can actually improve downstream tasks, the impact is hard to assess; a concrete and detailed example case study of the use of the theory would be very helpful with these issues. the results about how novel (s,a)-specific forms of regularization can improve performance are fairly heuristic. for example, they suggest using f_{s,a}(x) = \frac{1}{2}\,(x - \hat{q}^{\pi}(s,a))^2 for q-function estimation, where \hat{q}^{\pi} is some first-stage estimate of the q-function, which they argue results in zero error when the first-stage estimate is perfectly accurate, but they do not provide any actual concrete bounds on the resulting l2 error given provable bounds on the accuracy of \hat{q}^{\pi}. similarly, in their experiments they do not test these novel regularization methods with an actual first-stage estimate of \hat{q}^{\pi} but only with artificial error added to q^{\pi}, which makes the empirical usefulness of this novel regularization difficult to assess. there are several trivializing assumptions made, including (1) finite state space and (2) finite function classes. while it is true that extending results from finite function classes to infinite ones using eg vc-dimension or rademacher complexity is often straightforward, this is not necessarily always the case, or at least making this extension is sometimes less straightforward or requires additional assumptions, so not including more general results is definitely a weakness. similarly, is there any reason why the state space was assumed to be finite? this is an unrealistic assumption in many settings, and none of the results depend on the size of the state space, so if there are any theoretical reasons why the theory doesn't naturally extend to infinite state spaces this should be made clear. similarly, the iid data assumption is generally unrealistic, although i am more sympathetic there since it is usually trivial to replace eg concentration
inequalities clt etc with corresponding markovian versions given mixing assumptions however there should at least be some discussion in the paper about this issues with limitations already addressed in the strengthsweaknesses section no negative societal impact issues that i think need to be addressed docsepthe paper studies estimating q function in the offline setup the stateaction space is assumed to be discrete and the performance of the estimation is measured under a userspecified measure that can be different from the data distribution the major caveat stems from the distribution shift and existing results requires completeness and realizability assumptions this work proposes a lagrangian method and provides statistical guarantees in the absent of completeness assumption the analytical framework is further extended to weight function estimation offline offpolicy evaluation is a very important problem in rl and has many applications across various domains the paper focuses on addressing the change of measure issue between evaluating and sampling distributions moreover the paper only needs realizability assumption both of these advances are important in offpolicy evaluation there are many existing works studying offpolicy evaluation and to my knowledge the proposed method here is new the paper is relatively easy to follow with some minor issues on notations the result seems correct and sound although i have not checked the detailed proofs i will discuss some weaknesses in the questions and limitations section i dont see very obvious limitations in the paper comparing to concurrent theoretical studies of rl as mentioned the assumption on weight function wf may be a bit strong the authors discussed the barriers of obtaining faster rate n12 in appendix which mainly owes to the regularity of the lagrangian function i am interested in whether the n12 rate is ever possible with novel techniques ### Summary:
the authors provide slow rates for q-function estimation based on minimax objectives. the contribution is technically solid but seems somewhat incremental, and even though the authors provided responses to all major reviewer concerns, there is still concern among reviewers about the applicability of their result and about its incrementality. despite this, it seems a solid contribution to the rl literature.
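note: the reviews above refer to a minimax objective without writing it out. purely as a hedged illustration (a generic form; the submission's exact lagrangian, regularizers, and function classes are not reproduced here and may differ), a minimax bellman-residual objective for estimating q^pi from offline tuples (s, a, r, s') with discount gamma is

$$
\hat q \;\in\; \arg\min_{q \in \mathcal Q}\; \max_{f \in \mathcal F}\;
\mathbb{E}_n\!\Big[\, f(s,a)\,\big(r + \gamma\, \mathbb{E}_{a' \sim \pi(\cdot \mid s')}[\,q(s',a')\,] - q(s,a)\big) \Big]
\;-\; \lambda\, \mathbb{E}_n\big[\, f(s,a)^2 \big].
$$

under realizability (q^pi in Q), the inner maximum is zero at q = q^pi and, for a sufficiently rich F and adequate data coverage, positive elsewhere, which is loosely why only realizability-type assumptions are needed; the n^{-1/4} versus n^{-1/2} rates discussed above concern how fast the empirical saddle point concentrates.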
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper proposes a concatenation approach to combine multiple social media texts through a stacked embedding layer and demonstrates its effect in depression prediction based on text. the motivation and the results of the paper are very interesting and definitely will be of importance in a more applied community focused on nlp or ml for mental health. however, the technical novelty for a venue such as iclr is somewhat limited: all network components used exist already, and while the different pretraining embedding models used appear to be effective, they do not teach us anything new about deep learning. i would suggest to the authors to consider a different venue that appreciates this kind of finding. see above.

this paper proposes a deep learning approach called sercnn for depression detection from social media data (tweets). the proposed approach is flexible and stacks different embeddings (a glove pretrained model and a learned embedding vector from an lstm) for a robust and richer tweet representation. overall, the contributions of this approach are threefold: (i) leveraging social media as a valuable source of information for depression detection; (ii) the authors employ stacked embeddings, which is a good technique to handle out-of-vocabulary words; and (iii) using a few training examples (eg 10 tweets per user) the proposed approach showed a remarkable performance, 78% accuracy, in depression prediction. further, the authors conduct a set of experiments on a public twitter dataset for depression detection and compare it with different baselines. strong points: in general the paper is clearly written and easy to follow; the problem of depression detection is well-motivated; the overall presentation is good. the authors discussed adequately different related works on the problem of depression detection and stated the gap of the existing approaches, which rely on handcrafted features to detect depression. the authors describe well the preprocessing steps and create a balanced version of the dataset for training the model. weak points: while most of the paper is easy to understand, there are some points in the paper that lack clarity. for the novelty and significance: the authors propose a flexible concatenation of different embeddings for a robust and rich tweet representation; however, the novelty of this method is inadequate compared with state-of-the-art contextualized embeddings such as bert. for example, the research work [1] addresses the same problem and considers contextualized embeddings (eg bert, roberta) as state-of-the-art embedding baselines. here i suggest the authors benchmark their approach against bert embeddings; for example, the authors can use bertweet [2] as a baseline to sercnn. in table 2 the authors show the hyperparameters optimized for the proposed approach and baselines; it is not clear to me that all models are optimized with the same hyperparameter values, given that these models have different architectures. some details about the experimental setup are missing, for example how the authors split the data for training, validation and testing the models (eg the train/test split ratio), and whether there is overfitting during the models' training. further, stacking embeddings is also not memory efficient. i find the advantage of stacked embeddings is addressing out-of-vocabulary words; here i suggest the authors conduct more experiments to highlight this contribution in their experiments as well. since the authors didn't specify the test data used, it would be
better to evaluate their approach on a different dataset (see [3]) which may contain out-of-vocabulary words and can show the efficiency of the proposed approach. also, i wonder if stacked embeddings are efficient for this task only, or whether they give a generic, richer representation that can achieve good performance with very little data. in section 5 the authors analyzed the performance of their approach with little training data (eg 10, 30, 100, 500, etc); i wonder how they sample these tweets from the dataset (randomly?). here i recommend obtaining multiple samplings and performing a standard statistical test to benchmark how significant the outperforming results are. although this study lacks explainability of depression detection and the authors name it as future work, i find some studies [4] in 2020 that address depression detection with an explainable approach.

[1] zhang, yipeng, et al. monitoring depression trend on twitter during the covid-19 pandemic. arxiv preprint arxiv:2007.00228, 2020.
[2] https://github.com/VinAIResearch/BERTweet
[3] https://github.com/swcwang/depression-detection
[4] zogan, hamad, et al. explainable depression detection with multimodalities using a hybrid deep learning model on social media. arxiv preprint arxiv:2007.02847, 2020.

overall i rate this paper as marginally below the acceptance threshold (5). the problem is interesting; however, the proposed approach lacks a comprehensive evaluation against state-of-the-art embedding models to ensure the efficiency of the approach. for example, the authors can employ bert embeddings as a baseline to evaluate to which extent the sercnn approach is good. further, i suggest the authors clarify how they perform sampling (eg 10 tweets) to assess their approach with little data.

this paper develops sercnn, which consists of stacked embeddings and a recurrent cnn (rcnn) for depression detection from twitter text. the key idea of stacked embeddings is to concatenate two embeddings, based on two different word embedding models pretrained on twitter and wikipedia corpora, into a single embedding vector which is fed into the rcnn. experimental results on the twitter depression dataset show reasonable performance when trained on 10 posts from each user, which was further improved when trained on more data. strengths: s1, sercnn shows solid improvements over existing solutions on the twitter depression detection dataset. weaknesses: w1, novelty and technical contributions are not significant; w2, the paper lacks comparisons against pretrained transformer-based language models; w3, no insights about depression are provided. major comments: w1, sercnn is a straightforward combination of existing techniques; essentially, the paper combines 2x glove word embeddings into a single vector without using any learning mechanism for each input word, and then uses the concatenated vector as input to an rcnn model. w2, the authors mention that the reason they prefer rcnn over rnn is to capture context information over a long sequence, which can also be addressed by using the self-attention mechanism. the paper does not compare with any transformer-based models, especially pretrained transformer-based language models (eg bert) or more recent models, which are considered more label efficient than conventional word embedding models (eg glove) and rnns (technically this paper uses an rcnn). in fact, pretrained language models were already used in the context of emotion detection from text, thus it is not natural to disregard the technique for depression detection. for example: [1] yen-hao huang, ssu-rui lee, mau-yun ma, yi-hsin chen, ya-wen yu, yi-shin chen. emotionx-idea: emotion bert, an affectional
model for conversation, socialnlp 2019, https://arxiv.org/abs/1908.06264; [2] kisu yang, dongyub lee, taesun whang, seolhwa lee, heuiseok lim. emotionx-ku: bert-max based contextual emotion classifier, socialnlp 2019, https://arxiv.org/abs/1906.11565. w3, i would expect to see new findings on depression detection on twitter; the paper simply presents a technique, which has issues with respect to novelty and technical significance as commented above, and claims the contribution by showing the numbers. this point may not be in the main scope of iclr (which may put more emphasis on technical contributions), but i believe it is very important for application-oriented papers. minor comments on presentation style: figure 2 is misleading; it looks like a linear layer is applied on top of the concatenated vector, which is not the case (it is just the concatenation). although the paper tackles an important problem, it does not have a sufficient level of novelty and technical contribution: sercnn is a straightforward combination of existing methods, sercnn relies on pretrained word embedding models, and pretrained transformer-based language models should be compared.

this paper proposes a stacked embedding recurrent neural network named sercnn to detect depression from twitter. first, the authors use stacked meta-embeddings to gain the stacked word information; then the rcnn structure is utilized to capture contextual features. the experimental results show the effectiveness of the proposed model. positives: the depression detection application is valuable, the method achieves good performance, and the whole structure is nice. concerns: the key concern about the paper is the lack of novelty; the rcnn method is classic and the word embedding concatenation is also very common, so the novelty of this paper is not enough. the motivation is not clear: the reason why the authors use rcnn and what problem the authors want to solve should be explained, rather than only aiming for better results. the paper states that it does not use the recent transformer model; why not use transformer-based methods like bert? the authors should explain this and add comparative experiments. in the experimental results, why does the ehan achieve the best results and not the proposed sercnn? in addition, the paper says that sercnn can achieve the best performance when using a few posts, but there are no comparative results that contain results of other methods with 10 posts or 100 posts. the authors should add ablation studies to show the effectiveness of stacked embeddings. the related works are not sufficient and much of the latest literature is missing. minor comments: different evaluation metrics should not be put in one figure (fig 4); some references should be added for background. the novelty is not enough and many experiments are missing. ### Summary:
this paper tackles a very important problem of detecting depression on twitter as the reviewers expressed in their reviews this paper will be of interest to the community of researchers applying ml models to the mental health domain it is unfortunate that the authors did not respond to the reviewers concerns and questions i strongly encourage the authors to improve the paper based on the reviewers comments and questions and resubmit to a future venue
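the stacked embedding idea described in the reviews above concatenating two pretrained word embedding lookups per token before a recurrent convolutional encoder can be sketched as follows the class name the random stand in vectors used in place of glove the shared vocabulary across both tables and all dimensions are illustrative assumptions and are not details taken from the reviewed paper

```python
import torch
import torch.nn as nn

class StackedEmbeddingRCNN(nn.Module):
    """Illustrative sketch of stacked embeddings feeding an RCNN-style encoder.

    Two frozen pretrained embedding tables (e.g. GloVe trained on Twitter and
    on Wikipedia) are looked up for the same token ids and concatenated, then
    passed through a bidirectional GRU, a 1-d convolution and max pooling.
    Assumes both tables share one vocabulary, which is a simplification.
    """

    def __init__(self, twitter_vectors, wiki_vectors, hidden=64, n_classes=2):
        super().__init__()
        self.emb_twitter = nn.Embedding.from_pretrained(twitter_vectors, freeze=True)
        self.emb_wiki = nn.Embedding.from_pretrained(wiki_vectors, freeze=True)
        stacked_dim = twitter_vectors.size(1) + wiki_vectors.size(1)
        self.rnn = nn.GRU(stacked_dim, hidden, batch_first=True, bidirectional=True)
        self.conv = nn.Conv1d(2 * hidden, hidden, kernel_size=3, padding=1)
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer indices into the shared vocabulary
        stacked = torch.cat(
            [self.emb_twitter(token_ids), self.emb_wiki(token_ids)], dim=-1
        )                                    # (batch, seq, dim_twitter + dim_wiki)
        contextual, _ = self.rnn(stacked)    # (batch, seq, 2 * hidden)
        feats = torch.relu(self.conv(contextual.transpose(1, 2)))
        pooled = feats.max(dim=-1).values    # global max pooling over time
        return self.classifier(pooled)       # (batch, n_classes)

# usage with random vectors standing in for pretrained GloVe tables
vocab_size = 1000
model = StackedEmbeddingRCNN(torch.randn(vocab_size, 200), torch.randn(vocab_size, 300))
logits = model(torch.randint(0, vocab_size, (8, 50)))   # 8 users, 50 tokens each
```

a bert style contextual encoder as the reviewers request would replace the two frozen lookup tables with a single pretrained model rather than concatenating static vectors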
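the recommendation above to draw multiple random subsets of tweets per user and run a standard statistical test can also be sketched the subset size the number of repeats and the paired t test below are illustrative choices not prescriptions from the review and the predict functions are hypothetical placeholders

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def repeated_subset_accuracy(predict_fn, user_posts, labels,
                             posts_per_user=10, n_repeats=10):
    """Accuracy of predict_fn over n_repeats random drawings of
    posts_per_user posts from each user (illustrative sketch only)."""
    scores = []
    for _ in range(n_repeats):
        sampled = [
            list(rng.choice(posts, size=min(posts_per_user, len(posts)), replace=False))
            for posts in user_posts
        ]
        preds = np.array([predict_fn(s) for s in sampled])
        scores.append(float(np.mean(preds == np.array(labels))))
    return np.array(scores)

# paired t-test between two hypothetical models on the same resampling protocol
# scores_a = repeated_subset_accuracy(model_a_predict, posts, labels)
# scores_b = repeated_subset_accuracy(model_b_predict, posts, labels)
# t_stat, p_value = stats.ttest_rel(scores_a, scores_b)
```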
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 29328, 247, 32147, 318, 2746, 281, 13398, 2709, 2675, 3420, 17438, 949, 247, 24982, 21496, 3828, 285, 14371, 697, 1055, 275, 9454, 10554, 1754, 327, 2505, 253, 16038, 285, 253, 1543, 273, 253, 2929, 403, 1077, 4722, 285, 7964, 588, 320, 273, 6349, 275, 247, 625, 3732, 3114, 7106, 327, 295, 24343, 390, 13361, 323, 6255, 1786, 2299, 253, 7681, 38135, 323, 247, 18767, 824, 347, 17857, 32888, 310, 8489, 3710, 512, 2990, 4295, 908, 2226, 2168, 285, 1223, 253, 1027, 3215, 26208, 21496, 3210, 908, 3176, 281, 320, 3576, 597, 513, 417, 9798, 441, 2712, 747, 670, 3676, 4715, 50276, 74, 651, 1804, 281, 253, 4477, 281, 1908, 247, 1027, 18767, 326, 6373, 28032, 436, 2238, 273, 4342, 923, 1840, 5474, 33032, 2520, 2929, 29328, 247, 3676, 4715, 2746, 1925, 1151, 68, 9866, 323, 9454, 5481, 432, 2675, 3420, 941, 28311, 253, 4081, 2746, 310, 12112, 285, 34577, 1027, 46234, 38081, 3215, 11273, 1566, 50276, 395, 247, 6311, 21496, 4972, 432, 298, 296, 78, 323, 10237, 285, 38539, 28311, 6779, 4583, 253, 9021, 273, 436, 2746, 403, 2626, 34579, 891, 19732, 2977, 2675, 3420, 347, 247, 9865, 2603, 273, 1491, 323, 9454, 5481, 21255, 253, 4477, 2126, 24982, 24224, 26935, 310, 247, 1175, 5853, 281, 6016, 562, 1171, 87, 406, 25718, 3000, 285, 37685, 970, 247, 1643, 3733, 6667, 24088, 884, 28311, 591, 2608, 253, 4081, 2746, 2692, 247, 13406, 3045, 10523, 7200, 275, 9454, 10554, 50276, 44295, 253, 4477, 2589, 247, 873, 273, 4679, 327, 247, 1345, 34302, 10895, 323, 9454, 5481, 285, 7277, 352, 342, 1027, 1666, 25379, 50276, 9072, 2792, 50275, 249, 2087, 253, 2929, 310, 4518, 3542, 285, 3477, 281, 956, 253, 1895, 273, 9454, 5481, 310, 973, 24013, 8550, 253, 4583, 9759, 310, 1175, 50276, 783, 4477, 5469, 18212, 1027, 2905, 2987, 281, 253, 1895, 273, 9454, 5481, 597, 4767, 253, 8037, 273, 253, 5368, 7274, 534, 10725, 327, 1133, 12517, 3386, 281, 2736, 9454, 50276, 783, 4477, 6266, 973, 253, 638, 21678, 5018, 285, 2794, 247, 16645, 2715, 273, 253, 10895, 323, 3733, 253, 1566, 50275, 20881, 2792, 50276, 6050, 954, 273, 253, 2929, 310, 3477, 281, 2096, 627, 403, 690, 2792, 275, 253, 2929, 326, 3480, 19843, 323, 253, 38135, 285, 8453, 50275, 783, 4477, 12661, 247, 12112, 32147, 318, 273, 1027, 46234, 323, 247, 10237, 285, 6793, 15975, 6779, 2299, 253, 38135, 273, 436, 1332, 310, 18766, 2429, 342, 1375, 1171, 435, 33876, 1025, 46234, 824, 347, 270, 797, 50276, 1542, 1650, 253, 2561, 789, 337, 12453, 253, 1072, 1895, 285, 19401, 33876, 1025, 46234, 24088, 270, 797, 687, 589, 893, 347, 1375, 1171, 435, 24224, 26935, 1666, 25379, 1060, 891, 1804, 253, 2488, 22791, 616, 2746, 1411, 270, 797, 46234, 323, 1650, 253, 4477, 476, 897, 270, 797, 8775, 374, 347, 247, 8245, 281, 1151, 68, 9866, 50274, 249, 2829, 374, 253, 4477, 921, 253, 4373, 22041, 18325, 323, 253, 4081, 2746, 285, 1666, 25379, 436, 310, 417, 2590, 281, 479, 326, 512, 3210, 403, 18325, 407, 253, 1072, 4373, 19484, 2193, 1677, 326, 841, 3210, 452, 1027, 35615, 50276, 8826, 4278, 670, 253, 4679, 9978, 403, 9829, 323, 1650, 849, 253, 4477, 8085, 253, 941, 323, 3733, 12820, 285, 5175, 253, 3210, 24088, 1140, 565, 383, 8085, 4313, 310, 627, 689, 31893, 1309, 253, 3210, 3733, 50275, 44295, 37444, 46234, 310, 671, 417, 3541, 5919, 891, 1089, 253, 5750, 273, 24982, 46234, 310, 15974, 562, 1171, 87, 406, 25718, 3000, 1060, 891, 1804, 253, 4477, 2589, 625, 4679, 281, 6780, 436, 7680, 275, 616, 4679, 347, 973, 50274, 17480, 253, 
4477, 42126, 13199, 253, 1071, 941, 908, 352, 651, 320, 1805, 281, 7472, 616, 2746, 327, 247, 1027, 10895, 923, 495, 534, 778, 3831, 562, 87, 406, 25718, 3000, 285, 476, 921, 253, 6733, 273, 253, 4081, 2746, 50275, 12563, 891, 4282, 604, 37444, 21496, 310, 5919, 323, 436, 4836, 760, 390, 697, 247, 12314, 38539, 6779, 326, 476, 5115, 1175, 16226, 342, 1077, 1652, 941, 50275, 249, 2593, 608, 253, 4477, 5867, 253, 3045, 273, 616, 2746, 342, 1652, 3733, 941, 24088, 884, 1884, 2233, 6783, 3966, 891, 4282, 849, 513, 368, 3410, 841, 28311, 432, 253, 10895, 12421, 1060, 891, 5583, 13546, 2709, 1775, 22945, 285, 9591, 247, 2629, 7605, 1071, 281, 22791, 849, 1534, 403, 253, 41731, 14692, 1543, 50276, 20261, 436, 1263, 19756, 253, 5513, 1430, 273, 9454, 5481, 285, 253, 4477, 1416, 352, 347, 2852, 789, 891, 1089, 690, 2175, 577, 275, 9169, 326, 2953, 9454, 5481, 342, 271, 5513, 494, 2746, 50274, 18, 1182, 12109, 340, 532, 1205, 1162, 355, 8667, 9454, 9058, 327, 34302, 1309, 253, 9383, 301, 746, 26296, 549, 32693, 638, 3845, 549, 32693, 8602, 361, 20694, 9169, 374, 5987, 7280, 681, 87, 1758, 603, 8716, 6291, 8775, 50276, 20, 5987, 7280, 681, 2140, 68, 33317, 7955, 1256, 49558, 50274, 21, 1182, 19356, 10546, 324, 1162, 355, 5513, 494, 9454, 5481, 342, 23390, 351, 12908, 970, 247, 9769, 3676, 4715, 1566, 327, 2675, 3420, 549, 32693, 638, 3845, 549, 32693, 1518, 1967, 1619, 2504, 9169, 50273, 1189, 455, 891, 2281, 436, 2929, 347, 42876, 2708, 253, 14924, 7887, 608, 253, 1895, 310, 4722, 2299, 253, 4081, 2746, 19756, 247, 11088, 7103, 342, 1375, 23037, 248, 21496, 3210, 281, 5416, 253, 6733, 273, 253, 2746, 323, 1650, 253, 4477, 476, 2126, 270, 797, 21496, 347, 247, 8245, 281, 7472, 281, 534, 6070, 1151, 68, 9866, 2746, 310, 1175, 2007, 891, 1804, 253, 4477, 19148, 849, 513, 597, 1347, 10491, 24088, 884, 28311, 281, 2939, 616, 2746, 342, 1652, 941, 5474, 33032, 2520, 2929, 24357, 247, 1151, 68, 9866, 534, 8414, 273, 24982, 46234, 285, 18902, 260, 9866, 27657, 9866, 323, 9454, 5481, 432, 34302, 2505, 253, 2234, 2934, 273, 24982, 46234, 310, 281, 32147, 366, 767, 46234, 1754, 327, 767, 1027, 3159, 46234, 3210, 3215, 11273, 327, 34302, 285, 259, 15170, 5440, 6464, 715, 247, 2014, 21496, 4972, 534, 310, 10208, 715, 27657, 9866, 5661, 1543, 327, 253, 34302, 9454, 10895, 921, 5272, 3045, 672, 10166, 327, 884, 9319, 432, 1016, 2608, 534, 369, 2007, 5520, 672, 10166, 327, 625, 941, 50274, 45563, 50275, 84, 18, 1151, 68, 9866, 2722, 4891, 11701, 689, 5368, 5482, 327, 253, 34302, 9454, 5481, 10895, 50273, 20881, 1255, 265, 50275, 88, 18, 38135, 285, 7681, 9021, 403, 417, 1534, 50275, 88, 19, 253, 2929, 19756, 14023, 1411, 3215, 11273, 39707, 3169, 3448, 3210, 50275, 88, 20, 642, 16039, 670, 9454, 403, 2530, 50274, 24330, 5701, 50276, 88, 18, 1151, 68, 9866, 310, 247, 15246, 5019, 273, 5368, 5609, 9093, 253, 2929, 24772, 374, 89, 38081, 3159, 46234, 715, 247, 2014, 4972, 1293, 970, 667, 4715, 5122, 323, 1016, 3280, 3159, 285, 840, 4648, 253, 32147, 456, 4972, 347, 3280, 281, 271, 27657, 9866, 1566, 50274, 88, 19, 50276, 783, 4477, 3748, 253, 1921, 2139, 597, 4510, 27657, 9866, 689, 391, 9866, 310, 281, 9232, 3634, 1491, 689, 247, 1048, 3425, 534, 476, 320, 9713, 407, 970, 253, 1881, 42959, 5122, 50276, 783, 2929, 1057, 417, 7277, 342, 667, 39707, 3169, 3210, 3340, 3215, 11273, 47415, 8056, 3169, 3448, 3210, 24088, 270, 797, 390, 625, 3332, 3210, 534, 403, 2783, 625, 5203, 5919, 685, 6041, 3159, 21496, 3210, 24088, 38081, 50276, 30930, 2224, 22335, 436, 2929, 4648, 27657, 9866, 50276, 249, 958, 3215, 11273, 3448, 
3210, 497, 2168, 908, 275, 253, 3634, 273, 8991, 5481, 432, 2505, 3021, 352, 310, 417, 3626, 281, 27719, 253, 5853, 323, 9454, 5481, 323, 1650, 50274, 18, 340, 257, 31035, 30287, 606, 256, 9960, 4113, 458, 70, 278, 1952, 90, 328, 6429, 340, 6356, 7432, 260, 864, 340, 1403, 257, 340, 86, 340, 763, 249, 260, 864, 12904, 89, 36665, 12904, 270, 797, 50276, 266, 21909, 267, 1566, 323, 7827, 2675, 13307, 81, 6247, 5987, 39962, 2061, 5375, 16129, 1438, 3763, 1540, 50276, 19, 465, 37364, 30966, 277, 543, 90, 538, 458, 70, 15307, 265, 328, 364, 606, 396, 311, 73, 8754, 458, 70, 344, 86, 885, 536, 1579, 12904, 89, 13312, 270, 797, 4090, 1754, 33876, 12904, 30410, 2675, 13307, 81, 6247, 5987, 39962, 2061, 5375, 746, 3071, 12730, 2082, 50275, 88, 20, 50276, 74, 651, 1902, 281, 923, 747, 4342, 273, 9454, 5481, 327, 34302, 253, 2929, 3365, 10262, 247, 5853, 534, 556, 3374, 342, 1675, 281, 38135, 285, 7681, 8453, 347, 20503, 1840, 285, 3916, 253, 7680, 407, 4645, 253, 3904, 436, 1127, 778, 417, 320, 253, 2022, 7990, 273, 17857, 32888, 26332, 534, 778, 1691, 625, 15075, 327, 7681, 9021, 533, 891, 2868, 352, 310, 1077, 1774, 323, 2898, 21085, 9380, 50273, 37585, 5701, 50275, 49836, 3740, 4677, 374, 310, 24363, 253, 4677, 4453, 751, 247, 4872, 3828, 310, 3732, 327, 1755, 273, 253, 32147, 456, 4972, 534, 310, 417, 26332, 253, 32147, 318, 50276, 20261, 253, 2929, 39223, 271, 1774, 1895, 352, 1057, 417, 452, 247, 4209, 1268, 273, 38135, 285, 7681, 7680, 1151, 68, 9866, 310, 247, 4951, 71, 319, 472, 5019, 273, 5368, 3082, 1151, 68, 9866, 15771, 327, 3215, 11273, 3159, 21496, 3210, 3215, 11273, 39707, 3169, 3448, 3210, 943, 320, 2429, 50276, 7152, 33032, 2520, 2929, 29328, 247, 24982, 21496, 18902, 11454, 2990, 4907, 1151, 68, 9866, 281, 2736, 9454, 432, 34302, 806, 253, 4477, 897, 24982, 11419, 24224, 5361, 281, 6351, 253, 24982, 3159, 1491, 840, 253, 27657, 9866, 2605, 310, 12845, 281, 9232, 33876, 3386, 253, 5661, 1543, 921, 253, 12510, 273, 253, 4081, 1566, 18287, 50276, 783, 9454, 5481, 2898, 310, 9865, 285, 253, 1332, 33526, 1175, 3045, 50276, 783, 2644, 2605, 310, 5322, 50275, 585, 1209, 2224, 50276, 783, 2234, 4468, 670, 253, 2929, 310, 253, 3480, 273, 38135, 253, 27657, 9866, 1332, 310, 10610, 285, 253, 3159, 21496, 32147, 318, 310, 671, 1077, 1846, 594, 253, 38135, 273, 436, 2929, 310, 417, 2217, 50276, 783, 16038, 310, 417, 2590, 253, 1921, 2139, 253, 4477, 897, 27657, 9866, 285, 752, 1895, 253, 4477, 971, 281, 8415, 943, 320, 5544, 2581, 685, 760, 323, 2970, 1805, 1543, 50275, 783, 2929, 2296, 3185, 273, 970, 253, 3332, 39707, 1566, 2139, 417, 897, 39707, 3169, 3082, 751, 270, 797, 253, 4477, 943, 5513, 352, 285, 823, 20407, 4679, 50276, 249, 253, 5661, 1543, 2139, 1057, 253, 299, 5582, 5115, 253, 1682, 1543, 285, 417, 253, 4081, 209, 2269, 9866, 275, 1635, 253, 2929, 2296, 326, 253, 209, 2269, 9866, 476, 5115, 1682, 3045, 672, 970, 247, 1643, 9319, 533, 627, 403, 642, 20407, 1543, 326, 3831, 1543, 273, 643, 3082, 342, 884, 9319, 390, 2233, 9319, 50276, 783, 4477, 943, 823, 28913, 2175, 281, 921, 253, 12510, 273, 8031, 21496, 50276, 783, 2905, 2987, 403, 417, 4209, 285, 1142, 6323, 6239, 403, 5816, 50276, 37585, 5701, 50274, 19623, 7103, 17082, 943, 417, 320, 1691, 275, 581, 4677, 3036, 577, 50275, 8826, 10414, 943, 320, 2879, 323, 4114, 50276, 783, 38135, 310, 417, 2217, 285, 1142, 4679, 403, 5816, 2490, 187, 4118, 18435, 27, 2520, 2929, 39223, 247, 1077, 1774, 1895, 273, 15549, 9454, 327, 34302, 347, 253, 30628, 4469, 275, 616, 10123, 436, 2929, 588, 320, 273, 1600, 323, 253, 3114, 273, 8607, 9433, 
13361, 3210, 281, 6255, 1786, 5028, 352, 310, 23293, 326, 253, 4477, 858, 417, 3794, 281, 253, 30628, 7350, 285, 3533, 891, 7052, 11907, 253, 4477, 281, 3157, 253, 2929, 1754, 327, 253, 4477, 5701, 285, 3533, 285, 501, 538, 2225, 281, 247, 2852, 18767 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 29328, 247, 32147, 318, 2746, 281, 13398, 2709, 2675, 3420, 17438, 949, 247, 24982, 21496, 3828, 285, 14371, 697, 1055, 275, 9454, 10554, 1754, 327, 2505, 253, 16038, 285, 253, 1543, 273, 253, 2929, 403, 1077, 4722, 285, 7964, 588, 320, 273, 6349, 275, 247, 625, 3732, 3114, 7106, 327, 295, 24343, 390, 13361, 323, 6255, 1786, 2299, 253, 7681, 38135, 323, 247, 18767, 824, 347, 17857, 32888, 310, 8489, 3710, 512, 2990, 4295, 908, 2226, 2168, 285, 1223, 253, 1027, 3215, 26208, 21496, 3210, 908, 3176, 281, 320, 3576, 597, 513, 417, 9798, 441, 2712, 747, 670, 3676, 4715, 50276, 74, 651, 1804, 281, 253, 4477, 281, 1908, 247, 1027, 18767, 326, 6373, 28032, 436, 2238, 273, 4342, 923, 1840, 5474, 33032, 2520, 2929, 29328, 247, 3676, 4715, 2746, 1925, 1151, 68, 9866, 323, 9454, 5481, 432, 2675, 3420, 941, 28311, 253, 4081, 2746, 310, 12112, 285, 34577, 1027, 46234, 38081, 3215, 11273, 1566, 50276, 395, 247, 6311, 21496, 4972, 432, 298, 296, 78, 323, 10237, 285, 38539, 28311, 6779, 4583, 253, 9021, 273, 436, 2746, 403, 2626, 34579, 891, 19732, 2977, 2675, 3420, 347, 247, 9865, 2603, 273, 1491, 323, 9454, 5481, 21255, 253, 4477, 2126, 24982, 24224, 26935, 310, 247, 1175, 5853, 281, 6016, 562, 1171, 87, 406, 25718, 3000, 285, 37685, 970, 247, 1643, 3733, 6667, 24088, 884, 28311, 591, 2608, 253, 4081, 2746, 2692, 247, 13406, 3045, 10523, 7200, 275, 9454, 10554, 50276, 44295, 253, 4477, 2589, 247, 873, 273, 4679, 327, 247, 1345, 34302, 10895, 323, 9454, 5481, 285, 7277, 352, 342, 1027, 1666, 25379, 50276, 9072, 2792, 50275, 249, 2087, 253, 2929, 310, 4518, 3542, 285, 3477, 281, 956, 253, 1895, 273, 9454, 5481, 310, 973, 24013, 8550, 253, 4583, 9759, 310, 1175, 50276, 783, 4477, 5469, 18212, 1027, 2905, 2987, 281, 253, 1895, 273, 9454, 5481, 597, 4767, 253, 8037, 273, 253, 5368, 7274, 534, 10725, 327, 1133, 12517, 3386, 281, 2736, 9454, 50276, 783, 4477, 6266, 973, 253, 638, 21678, 5018, 285, 2794, 247, 16645, 2715, 273, 253, 10895, 323, 3733, 253, 1566, 50275, 20881, 2792, 50276, 6050, 954, 273, 253, 2929, 310, 3477, 281, 2096, 627, 403, 690, 2792, 275, 253, 2929, 326, 3480, 19843, 323, 253, 38135, 285, 8453, 50275, 783, 4477, 12661, 247, 12112, 32147, 318, 273, 1027, 46234, 323, 247, 10237, 285, 6793, 15975, 6779, 2299, 253, 38135, 273, 436, 1332, 310, 18766, 2429, 342, 1375, 1171, 435, 33876, 1025, 46234, 824, 347, 270, 797, 50276, 1542, 1650, 253, 2561, 789, 337, 12453, 253, 1072, 1895, 285, 19401, 33876, 1025, 46234, 24088, 270, 797, 687, 589, 893, 347, 1375, 1171, 435, 24224, 26935, 1666, 25379, 1060, 891, 1804, 253, 2488, 22791, 616, 2746, 1411, 270, 797, 46234, 323, 1650, 253, 4477, 476, 897, 270, 797, 8775, 374, 347, 247, 8245, 281, 1151, 68, 9866, 50274, 249, 2829, 374, 253, 4477, 921, 253, 4373, 22041, 18325, 323, 253, 4081, 2746, 285, 1666, 25379, 436, 310, 417, 2590, 281, 479, 326, 512, 3210, 403, 18325, 407, 253, 1072, 4373, 19484, 2193, 1677, 326, 841, 3210, 452, 1027, 35615, 50276, 8826, 4278, 670, 253, 4679, 9978, 403, 9829, 323, 1650, 849, 253, 4477, 8085, 253, 941, 323, 3733, 12820, 285, 5175, 253, 3210, 24088, 1140, 565, 383, 8085, 4313, 310, 627, 689, 31893, 1309, 253, 3210, 3733, 50275, 44295, 37444, 46234, 310, 671, 417, 3541, 5919, 891, 1089, 253, 5750, 273, 24982, 46234, 310, 15974, 562, 1171, 87, 406, 25718, 3000, 1060, 891, 1804, 253, 4477, 2589, 625, 4679, 281, 6780, 436, 7680, 275, 616, 4679, 347, 973, 50274, 17480, 253, 
4477, 42126, 13199, 253, 1071, 941, 908, 352, 651, 320, 1805, 281, 7472, 616, 2746, 327, 247, 1027, 10895, 923, 495, 534, 778, 3831, 562, 87, 406, 25718, 3000, 285, 476, 921, 253, 6733, 273, 253, 4081, 2746, 50275, 12563, 891, 4282, 604, 37444, 21496, 310, 5919, 323, 436, 4836, 760, 390, 697, 247, 12314, 38539, 6779, 326, 476, 5115, 1175, 16226, 342, 1077, 1652, 941, 50275, 249, 2593, 608, 253, 4477, 5867, 253, 3045, 273, 616, 2746, 342, 1652, 3733, 941, 24088, 884, 1884, 2233, 6783, 3966, 891, 4282, 849, 513, 368, 3410, 841, 28311, 432, 253, 10895, 12421, 1060, 891, 5583, 13546, 2709, 1775, 22945, 285, 9591, 247, 2629, 7605, 1071, 281, 22791, 849, 1534, 403, 253, 41731, 14692, 1543, 50276, 20261, 436, 1263, 19756, 253, 5513, 1430, 273, 9454, 5481, 285, 253, 4477, 1416, 352, 347, 2852, 789, 891, 1089, 690, 2175, 577, 275, 9169, 326, 2953, 9454, 5481, 342, 271, 5513, 494, 2746, 50274, 18, 1182, 12109, 340, 532, 1205, 1162, 355, 8667, 9454, 9058, 327, 34302, 1309, 253, 9383, 301, 746, 26296, 549, 32693, 638, 3845, 549, 32693, 8602, 361, 20694, 9169, 374, 5987, 7280, 681, 87, 1758, 603, 8716, 6291, 8775, 50276, 20, 5987, 7280, 681, 2140, 68, 33317, 7955, 1256, 49558, 50274, 21, 1182, 19356, 10546, 324, 1162, 355, 5513, 494, 9454, 5481, 342, 23390, 351, 12908, 970, 247, 9769, 3676, 4715, 1566, 327, 2675, 3420, 549, 32693, 638, 3845, 549, 32693, 1518, 1967, 1619, 2504, 9169, 50273, 1189, 455, 891, 2281, 436, 2929, 347, 42876, 2708, 253, 14924, 7887, 608, 253, 1895, 310, 4722, 2299, 253, 4081, 2746, 19756, 247, 11088, 7103, 342, 1375, 23037, 248, 21496, 3210, 281, 5416, 253, 6733, 273, 253, 2746, 323, 1650, 253, 4477, 476, 2126, 270, 797, 21496, 347, 247, 8245, 281, 7472, 281, 534, 6070, 1151, 68, 9866, 2746, 310, 1175, 2007, 891, 1804, 253, 4477, 19148, 849, 513, 597, 1347, 10491, 24088, 884, 28311, 281, 2939, 616, 2746, 342, 1652, 941, 5474, 33032, 2520, 2929, 24357, 247, 1151, 68, 9866, 534, 8414, 273, 24982, 46234, 285, 18902, 260, 9866, 27657, 9866, 323, 9454, 5481, 432, 34302, 2505, 253, 2234, 2934, 273, 24982, 46234, 310, 281, 32147, 366, 767, 46234, 1754, 327, 767, 1027, 3159, 46234, 3210, 3215, 11273, 327, 34302, 285, 259, 15170, 5440, 6464, 715, 247, 2014, 21496, 4972, 534, 310, 10208, 715, 27657, 9866, 5661, 1543, 327, 253, 34302, 9454, 10895, 921, 5272, 3045, 672, 10166, 327, 884, 9319, 432, 1016, 2608, 534, 369, 2007, 5520, 672, 10166, 327, 625, 941, 50274, 45563, 50275, 84, 18, 1151, 68, 9866, 2722, 4891, 11701, 689, 5368, 5482, 327, 253, 34302, 9454, 5481, 10895, 50273, 20881, 1255, 265, 50275, 88, 18, 38135, 285, 7681, 9021, 403, 417, 1534, 50275, 88, 19, 253, 2929, 19756, 14023, 1411, 3215, 11273, 39707, 3169, 3448, 3210, 50275, 88, 20, 642, 16039, 670, 9454, 403, 2530, 50274, 24330, 5701, 50276, 88, 18, 1151, 68, 9866, 310, 247, 15246, 5019, 273, 5368, 5609, 9093, 253, 2929, 24772, 374, 89, 38081, 3159, 46234, 715, 247, 2014, 4972, 1293, 970, 667, 4715, 5122, 323, 1016, 3280, 3159, 285, 840, 4648, 253, 32147, 456, 4972, 347, 3280, 281, 271, 27657, 9866, 1566, 50274, 88, 19, 50276, 783, 4477, 3748, 253, 1921, 2139, 597, 4510, 27657, 9866, 689, 391, 9866, 310, 281, 9232, 3634, 1491, 689, 247, 1048, 3425, 534, 476, 320, 9713, 407, 970, 253, 1881, 42959, 5122, 50276, 783, 2929, 1057, 417, 7277, 342, 667, 39707, 3169, 3210, 3340, 3215, 11273, 47415, 8056, 3169, 3448, 3210, 24088, 270, 797, 390, 625, 3332, 3210, 534, 403, 2783, 625, 5203, 5919, 685, 6041, 3159, 21496, 3210, 24088, 38081, 50276, 30930, 2224, 22335, 436, 2929, 4648, 27657, 9866, 50276, 249, 958, 3215, 11273, 3448, 
3210, 497, 2168, 908, 275, 253, 3634, 273, 8991, 5481, 432, 2505, 3021, 352, 310, 417, 3626, 281, 27719, 253, 5853, 323, 9454, 5481, 323, 1650, 50274, 18, 340, 257, 31035, 30287, 606, 256, 9960, 4113, 458, 70, 278, 1952, 90, 328, 6429, 340, 6356, 7432, 260, 864, 340, 1403, 257, 340, 86, 340, 763, 249, 260, 864, 12904, 89, 36665, 12904, 270, 797, 50276, 266, 21909, 267, 1566, 323, 7827, 2675, 13307, 81, 6247, 5987, 39962, 2061, 5375, 16129, 1438, 3763, 1540, 50276, 19, 465, 37364, 30966, 277, 543, 90, 538, 458, 70, 15307, 265, 328, 364, 606, 396, 311, 73, 8754, 458, 70, 344, 86, 885, 536, 1579, 12904, 89, 13312, 270, 797, 4090, 1754, 33876, 12904, 30410, 2675, 13307, 81, 6247, 5987, 39962, 2061, 5375, 746, 3071, 12730, 2082, 50275, 88, 20, 50276, 74, 651, 1902, 281, 923, 747, 4342, 273, 9454, 5481, 327, 34302, 253, 2929, 3365, 10262, 247, 5853, 534, 556, 3374, 342, 1675, 281, 38135, 285, 7681, 8453, 347, 20503, 1840, 285, 3916, 253, 7680, 407, 4645, 253, 3904, 436, 1127, 778, 417, 320, 253, 2022, 7990, 273, 17857, 32888, 26332, 534, 778, 1691, 625, 15075, 327, 7681, 9021, 533, 891, 2868, 352, 310, 1077, 1774, 323, 2898, 21085, 9380, 50273, 37585, 5701, 50275, 49836, 3740, 4677, 374, 310, 24363, 253, 4677, 4453, 751, 247, 4872, 3828, 310, 3732, 327, 1755, 273, 253, 32147, 456, 4972, 534, 310, 417, 26332, 253, 32147, 318, 50276, 20261, 253, 2929, 39223, 271, 1774, 1895, 352, 1057, 417, 452, 247, 4209, 1268, 273, 38135, 285, 7681, 7680, 1151, 68, 9866, 310, 247, 4951, 71, 319, 472, 5019, 273, 5368, 3082, 1151, 68, 9866, 15771, 327, 3215, 11273, 3159, 21496, 3210, 3215, 11273, 39707, 3169, 3448, 3210, 943, 320, 2429, 50276, 7152, 33032, 2520, 2929, 29328, 247, 24982, 21496, 18902, 11454, 2990, 4907, 1151, 68, 9866, 281, 2736, 9454, 432, 34302, 806, 253, 4477, 897, 24982, 11419, 24224, 5361, 281, 6351, 253, 24982, 3159, 1491, 840, 253, 27657, 9866, 2605, 310, 12845, 281, 9232, 33876, 3386, 253, 5661, 1543, 921, 253, 12510, 273, 253, 4081, 1566, 18287, 50276, 783, 9454, 5481, 2898, 310, 9865, 285, 253, 1332, 33526, 1175, 3045, 50276, 783, 2644, 2605, 310, 5322, 50275, 585, 1209, 2224, 50276, 783, 2234, 4468, 670, 253, 2929, 310, 253, 3480, 273, 38135, 253, 27657, 9866, 1332, 310, 10610, 285, 253, 3159, 21496, 32147, 318, 310, 671, 1077, 1846, 594, 253, 38135, 273, 436, 2929, 310, 417, 2217, 50276, 783, 16038, 310, 417, 2590, 253, 1921, 2139, 253, 4477, 897, 27657, 9866, 285, 752, 1895, 253, 4477, 971, 281, 8415, 943, 320, 5544, 2581, 685, 760, 323, 2970, 1805, 1543, 50275, 783, 2929, 2296, 3185, 273, 970, 253, 3332, 39707, 1566, 2139, 417, 897, 39707, 3169, 3082, 751, 270, 797, 253, 4477, 943, 5513, 352, 285, 823, 20407, 4679, 50276, 249, 253, 5661, 1543, 2139, 1057, 253, 299, 5582, 5115, 253, 1682, 1543, 285, 417, 253, 4081, 209, 2269, 9866, 275, 1635, 253, 2929, 2296, 326, 253, 209, 2269, 9866, 476, 5115, 1682, 3045, 672, 970, 247, 1643, 9319, 533, 627, 403, 642, 20407, 1543, 326, 3831, 1543, 273, 643, 3082, 342, 884, 9319, 390, 2233, 9319, 50276, 783, 4477, 943, 823, 28913, 2175, 281, 921, 253, 12510, 273, 8031, 21496, 50276, 783, 2905, 2987, 403, 417, 4209, 285, 1142, 6323, 6239, 403, 5816, 50276, 37585, 5701, 50274, 19623, 7103, 17082, 943, 417, 320, 1691, 275, 581, 4677, 3036, 577, 50275, 8826, 10414, 943, 320, 2879, 323, 4114, 50276, 783, 38135, 310, 417, 2217, 285, 1142, 4679, 403, 5816, 2490, 187, 4118, 18435, 27, 2520, 2929, 39223, 247, 1077, 1774, 1895, 273, 15549, 9454, 327, 34302, 347, 253, 30628, 4469, 275, 616, 10123, 436, 2929, 588, 320, 273, 1600, 323, 253, 3114, 273, 8607, 9433, 
13361, 3210, 281, 6255, 1786, 5028, 352, 310, 23293, 326, 253, 4477, 858, 417, 3794, 281, 253, 30628, 7350, 285, 3533, 891, 7052, 11907, 253, 4477, 281, 3157, 253, 2929, 1754, 327, 253, 4477, 5701, 285, 3533, 285, 501, 538, 2225, 281, 247, 2852, 18767 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: update i find the revised version of the paper much clearer and streamlined than the originally submitted one and am mostly content with the authors reply to my comments however i still think the work would highly benefit from a nonheuristic justification of its approach and some theoretical guarantees on the performance of the proposed framework especially in which regimes it is beneficial and when it is not also i still find the presentation of experimental results too convoluted to give a clear and comprehensive picture of how this method compares to the competition when is it better when is it worse do the observations and claims generalize to other tasks and which are the right competing methods to consider i think the paper can still be improved on this aspect as well since i find the idea once it was clarified generally interesting i will raise my score to 6 the paper proposes an objective function for learning representations termed the conditional entropy bottleneck ceb variational bounds on the objective function are derived and used to train classifiers according to the ceb and compare the results to those attained by competing methods robustness and adversarial examples detection of ceb are emphasized my major comments are as follows 1 the authors base their informationtheoretic reasoning on the settheoretic structure of shannons information measures it is noteworthy that when dealing with more than 2 random variables eg when going from the twofold ixy to the threefold ixyz this theory has major issues in particular there are simple and natural examples for which ixyz is negative the paper presents an informationtheoretic heuristic and intuitive explanation for their ceb construction based on this framework no proofs backing up any of the claims of performance or robustness in the paper are given unfortunately with such counterintuitive issues of the underlying theory a heuristic explanation that motivates the proposed construction is not convincing simulations are presented to justify the construction but whether the claimed properties hold for a wide variety of setups remains unclear 2 appendix a is referred to early on for explaining the minimal necessary information mni but it is very unclear what the claim of this appendix is is there a claim it just seems like a convoluted and long explanation of mutual information even more so this explanation is inaccurate for instance the authors refer to the mutual information as a minimal sufficient statistic but it is not for a pair of random variables xy a sufficient statistic say for x given y is a function f of y such that x fy y forms a markov chain specifically fy is another random variable while the mutual information ixy is just a number i have multiple guesses on what the authors meaning could be here but was unable to figure it out from the text one option which is a pretty standard way to define a sufficient statistic through mutual information is as a function f such that the conditional mutual information ixyfy equals 0 such an f is a sufficient statistic since the zero mutual information term is equivalent to the markov chain x fy y from before is that what the authors mean 3 the zx variable introduced in section 3 is inspired by the ib framework footnote 2 if i understand correctly this means that in many applications zx is specified by a classifier of x wrt the label y my question is whether for a fixed set of system parameters zx is a deterministic function of x if this zx
play the role of the sufficient statistic ive referred to in my previous comment then it should be just a function of x however if zx = fx for a deterministic function f then the ceb from equation 3 is vacuous for many interesting cases of xy for instance if x is a continuous random variable and zx = fx is continuous as well then ixzxy = hzxy - hzxxy where h is the differential entropy and the subtracted term equals minus infinity by definition see section 83 of cover thomas 2006 consequently the mutual information and the ceb objective are infinite if zx = fx is a mixed random variable eg can be obtained from a relu neural network then the same happens other cases of interest such as discrete x and f being an injective mapping of the set of x values are also problematic for details of such problems associated with ib type terms see 1 r a amjad and b c geiger learning representations for neural networkbased classification using the information bottleneck principle 2018 httpsarxivorgabs180209766 can the authors account for that 4 the other two reviews addressed the missing account of past literature i agree on this point and will keep track of the authors responses i will not comment on that again beyond these specific issues the text is very wordy and confusing at times if some mathematical justification or modeling was employed the proposed framework might have been easier to accept the long heuristic explanations employed at the moment do not suffice for this reviewer unless the authors are able to provide clarification of all the above points and properly place their work in relation to past literature i cannot recommend acceptance docsepthis paper wants to discuss a new objective function which the authors dub conditional entropy bottleneck ceb motivated by learning better latent representations however as far as i can tell the objective function already exists in the oneparameter family of information bottleneck ib of tishby pereira and bialek the author seems to realize this in appendix b but calls it a somewhat surprising theoretical result however if we express ib as max izy - beta izx see 19 and then flip signs and take the max to the min we get min beta izx - izy taking beta = 1/2 multiplying through by 2 and writing ixz = iyz + ixzy we find ceb unfortunately i fail to see how this is surprising or different a difference only arises when using a variational approximation to ib the authors compare to the variational information bottleneck vib of alemi fischer dillon and murphy arxiv161200410 which requires a classifier an encoder and a marginal posterior over the latents here instead of the marginal posterior they learn a backwards encoder from labels to latents this difference arises because the ib objective has two terms of opposite sign and we can group them into positive definite terms in different ways creating different bounds perhaps this grouping leads to a better variational bound if so thats only a point about the variational method employed by alemi et al and not a separate objective as this seems to be the main contribution of the paper this point needs to be explained more carefully and in more detail for instance it seems worth pointing out in the discrete case that the marginal posterior has z values to estimate while the backwards encoder has z times y suggesting this is possibly a much harder learning problem if so there should be a compelling benefit for using this approximation and not the other one in summary the authors are not really clear about what they are doing and how it relates to ib furthermore
the need for this specific choice in ib parameter space is not made clear nor do the experimental results give a compelling need the experimental results are also not at all clearly presented or explained therefore i dont think this paper satisfies the quality clarity originality or significance criteria for iclr docsepupdate see comments on revisions below this paper essentially introduces a labeldependent regularization to the vib framework matching the encoder distribution to one computed from labels the authors show good performance in generalization such that their approach is relatively robust in a number of tasks such as adversarial defense the idea i think is generally good but there are several problems with this work first there have been recent advances in mutual information estimation first found in 1 this is an important departure from the usual variational approximations used in vib you need to compare to this baseline as it was shown that it outperforms vib in a similar classification task as presented in your work second far too much space is used to lay out some fairly basic formalism with respect to mutual information conditional entropy etc it would be nice for example to have an algorithm to make the learning objective more clear overall i dont feel the content justifies the length third i have some concerns about the significance of this work they introduce essentially a labeldependent backwards encoder to provide samples for the kl term normally found in vib the justification is that we need the bottleneck term to improve generalization and the backwards encoder term is supposed to keep the representation relevant to labels one could have used an approach like mine doing min information for the bottleneck and max info for the labels in addition much work has been done on learning representations that generalize using mutual information maximizing instead of minimizing 2 3 4 5 along with some sort of term to improve relevance and this work seems to ignore or not be aware of this work overall i could see some potential in this paper being published as i think the approach is sensible but its not presented in the proper context of past work 1 belghazi i baratin a rajeswar s courville a bengio y hjelm r d 2018 mine mutual information neural estimation international conference on machine learning 2018 2 gomes r krause a and perona p discriminative clustering by regularized information maximization in nips 2010 3 hu w miyato t tokui s matsumoto e and sugiyama m learning discrete representations via information maximizing selfaugmented training in icml 2017 4 hjelm r d fedorov a lavoiemarchildon s grewal k trischler a bengio y 2018 learning deep representations by mutual information estimation and maximization arxiv preprint arxiv180806670 5 oord aaron van den yazhe li and oriol vinyals representation learning with contrastive predictive coding arxiv preprint arxiv180703748 2018 ### Summary:
this paper proposes a criterion for representation learning minimum necessary information which states that for a task defined by some joint probability distribution pxy and the goal of for example predicting y from x a learned representation of x denoted z should satisfy the equality ixy = ixz = iyz the authors then propose an objective function the conditional entropy bottleneck ceb to ensure that a learned representation satisfies the minimum necessary information criterion and a variational approximation to the conditional entropy bottleneck that can be parameterized using deep networks and optimized with standard methods such as stochastic gradient descent the authors also relate the conditional entropy bottleneck to the information bottleneck lagrangian proposed by tishby showing that the ceb corresponds to the information bottleneck with beta = 05 a short worked version of this correspondence is sketched after this summary an important contribution of this work is that it gives a theoretical justification for selecting a specific value of beta rather than testing multiple values experiments on fashionmnist show that in comparison to a deterministic classifier and to variational information bottleneck models with beta in 001 01 05 the ceb model achieves good accuracy and calibration is competitive at detecting outofdistribution inputs and is more resistant to whitebox adversarial attacks another experiment demonstrates that a model trained with the ceb criterion is unable to memorize a randomly labeled version of fashionmnist there was a strong difference of opinion between the reviewers on this paper one reviewer r1 dismissed the work as trivial the authors rebutted this claim in their response and revision and r1 failed to participate in the discussion so the ac strongly discounted this review the other two reviewers had some concerns about the paper most of which were addressed by the revision but crucially some concerns still remain r4 would like more theoretical rigor in the paper while r2 would like a direct comparison against mine and cpc in the end the ac thinks that this paper needs just a bit more work to address these concerns the authors are encouraged to revise this work and submit it to another machine learning venue
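the relation between the mni criterion the ceb objective and the information bottleneck at beta = 05 mentioned in the review and in this summary can be written out as a short worked identity the sign and scaling conventions below follow the reviewers sketch and the markov chain z - x - y is assumed as in the reviewed paper

```latex
\begin{align*}
% minimum necessary information criterion for a representation Z of X
I(X;Y) &= I(X;Z) = I(Y;Z) \\[2pt]
% CEB objective (minimized) and IB objective (maximized, multiplier beta)
\mathrm{CEB} &= I(X;Z \mid Y) - I(Y;Z), \qquad
\mathrm{IB}_{\beta} = I(Y;Z) - \beta\, I(X;Z) \\[2pt]
% chain rule under the Markov chain Z - X - Y
I(X;Z) &= I(Y;Z) + I(X;Z \mid Y) \\[2pt]
% take beta = 1/2, flip the sign and multiply by 2
-2\,\mathrm{IB}_{1/2} &= I(X;Z) - 2\,I(Y;Z)
  = I(X;Z \mid Y) - I(Y;Z) = \mathrm{CEB}
\end{align*}
```

so minimizing ceb is equivalent to maximizing the ib objective at beta = 05 which is the correspondence the summary refers to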
[ 790, 1491, 5593, 352, 310, 35092, 326, 672, 10620, 342, 625, 685, 374, 3632, 4903, 24088, 672, 1469, 432, 253, 767, 8089, 891, 5246, 281, 253, 1264, 8089, 891, 35609, 436, 3762, 556, 2201, 3374, 275, 1798, 627, 403, 2969, 285, 3626, 6667, 323, 534, 891, 35609, 310, 4016, 253, 2929, 10262, 271, 1491, 783, 30325, 47641, 565, 48714, 8813, 323, 616, 260, 2275, 5140, 1754, 327, 436, 7792, 642, 27947, 19673, 598, 667, 273, 253, 3916, 273, 1347, 21955, 706, 461, 1255, 275, 253, 2929, 403, 1677, 19235, 342, 824, 4828, 565, 48714, 3374, 273, 253, 6944, 3762, 247, 344, 321, 261, 262, 68, 8813, 326, 15265, 684, 253, 4081, 5140, 310, 417, 21414, 9938, 403, 3559, 281, 15249, 253, 5140, 533, 1880, 253, 7558, 3607, 2186, 323, 247, 4618, 5235, 273, 873, 8777, 3464, 12744, 50276, 19, 30762, 247, 310, 6289, 281, 2393, 327, 323, 15571, 253, 8723, 3309, 1491, 278, 8311, 533, 352, 310, 1077, 12744, 752, 310, 253, 1750, 273, 436, 30762, 310, 627, 247, 1750, 697, 816, 3133, 751, 247, 2410, 311, 4525, 285, 1048, 8813, 273, 15577, 1491, 1014, 625, 594, 436, 8813, 310, 31215, 323, 4227, 253, 4477, 3730, 281, 253, 15577, 1491, 347, 247, 8723, 4209, 26312, 533, 352, 310, 417, 323, 247, 4667, 273, 3632, 4903, 1269, 90, 247, 4209, 26312, 1333, 323, 1269, 1677, 340, 310, 247, 1159, 269, 273, 340, 824, 1269, 71, 12502, 4948, 247, 1616, 729, 5931, 5742, 269, 90, 310, 1529, 3632, 4778, 253, 15577, 1491, 891, 5246, 310, 816, 247, 1180, 891, 452, 2709, 5476, 265, 327, 752, 253, 4477, 4495, 812, 320, 1060, 533, 369, 7591, 281, 4677, 352, 562, 432, 253, 2505, 581, 4500, 534, 310, 247, 3965, 2629, 1039, 281, 4853, 4209, 26312, 2167, 15577, 1491, 310, 347, 247, 1159, 269, 824, 326, 891, 5246, 31296, 17, 824, 271, 269, 310, 247, 4209, 26312, 1580, 253, 5058, 15577, 1491, 1307, 310, 6425, 281, 253, 1616, 729, 5931, 1269, 71, 12502, 432, 1078, 310, 326, 752, 253, 4477, 1599, 50276, 20, 253, 1182, 89, 4778, 5611, 275, 2593, 495, 275, 11797, 407, 253, 18890, 7792, 43302, 374, 604, 891, 2096, 9113, 436, 2097, 326, 275, 1142, 4893, 1182, 89, 310, 7616, 407, 247, 30410, 273, 1269, 8772, 253, 5203, 340, 619, 1953, 310, 1880, 323, 247, 4229, 873, 273, 985, 3602, 1182, 89, 310, 247, 30027, 1159, 273, 1269, 604, 436, 1182, 89, 1132, 253, 2554, 273, 253, 4209, 9990, 209, 422, 6289, 281, 275, 619, 2045, 4385, 840, 352, 943, 320, 816, 247, 1159, 273, 1269, 50275, 35529, 604, 1182, 14506, 89, 323, 247, 30027, 1159, 269, 840, 253, 260, 2275, 432, 5150, 495, 310, 5809, 3472, 323, 1142, 4722, 2219, 273, 1269, 90, 323, 4227, 604, 1269, 310, 247, 5415, 3632, 4778, 285, 1182, 14506, 89, 310, 5415, 347, 973, 840, 50276, 895, 91, 5246, 73, 91, 5246, 73, 91, 89, 5246, 835, 288, 310, 253, 8967, 15579, 285, 253, 42426, 2426, 18207, 2192, 555, 407, 5426, 923, 2593, 11439, 273, 3835, 50276, 394, 4921, 5403, 17912, 253, 15577, 1491, 285, 253, 260, 2275, 8103, 403, 11968, 604, 1182, 14506, 89, 310, 247, 6804, 3632, 4778, 24088, 476, 320, 4044, 432, 247, 774, 86, 11454, 2990, 840, 253, 1072, 6569, 643, 2219, 273, 1600, 824, 347, 13358, 1269, 285, 269, 1146, 271, 39510, 10603, 273, 253, 873, 273, 1269, 2193, 403, 671, 20276, 323, 4278, 273, 824, 1895, 2330, 342, 18890, 1511, 2426, 923, 50276, 18, 391, 247, 717, 75, 324, 285, 270, 260, 3471, 8047, 4715, 14237, 323, 11454, 2990, 3169, 9162, 970, 253, 1491, 3673, 44856, 8063, 4765, 5987, 39962, 2061, 5375, 1093, 9992, 4148, 2526, 50276, 5092, 253, 4477, 2395, 323, 326, 50276, 21, 253, 643, 767, 10123, 9713, 253, 5816, 8553, 323, 2469, 6239, 891, 5194, 327, 436, 1127, 285, 588, 1978, 3540, 273, 253, 4477, 6128, 
891, 588, 417, 4385, 327, 326, 969, 50275, 42218, 841, 2173, 2523, 597, 2505, 310, 1077, 3159, 90, 285, 21643, 387, 2069, 604, 690, 15965, 22861, 7645, 272, 369, 7091, 253, 4081, 7792, 1537, 452, 644, 6927, 281, 2997, 253, 1048, 47641, 22909, 7091, 387, 253, 2774, 513, 417, 36433, 323, 436, 37317, 5734, 253, 4477, 403, 2104, 281, 2085, 37699, 273, 512, 253, 1840, 2792, 285, 6283, 1659, 616, 789, 275, 5886, 281, 2469, 6239, 891, 2550, 5583, 14924, 50276, 7152, 33032, 2520, 2929, 5605, 281, 2319, 247, 747, 8103, 1159, 534, 253, 4477, 19155, 17697, 15579, 3673, 44856, 260, 2275, 17194, 407, 4715, 1805, 21624, 14237, 2299, 347, 2080, 347, 891, 476, 2028, 253, 8103, 3470, 2168, 4961, 275, 253, 327, 554, 274, 6245, 2021, 273, 1491, 3673, 44856, 18890, 273, 246, 763, 1615, 759, 42800, 285, 270, 451, 1441, 253, 2488, 3133, 281, 8968, 436, 275, 30762, 270, 533, 5841, 352, 247, 8489, 10084, 10527, 906, 2299, 604, 359, 3890, 18890, 347, 2781, 24901, 90, 50276, 2461, 24901, 89, 923, 655, 285, 840, 19153, 7871, 285, 1379, 253, 2781, 281, 253, 1054, 359, 755, 1054, 9840, 24901, 89, 50276, 478, 90, 3192, 9840, 50276, 805, 39763, 949, 407, 374, 285, 4028, 891, 39344, 50276, 14059, 1182, 50276, 895, 3847, 359, 1089, 260, 487, 19235, 891, 1891, 281, 923, 849, 436, 310, 10084, 390, 1027, 50276, 66, 3064, 760, 15877, 672, 970, 247, 39762, 11193, 281, 18890, 253, 4477, 7277, 281, 253, 39762, 1491, 3673, 44856, 362, 487, 273, 247, 5616, 74, 269, 23268, 277, 24632, 285, 4682, 12039, 549, 32693, 1036, 805, 5525, 740, 534, 4419, 247, 30410, 271, 32049, 285, 247, 16888, 12637, 689, 253, 4329, 592, 1060, 3185, 273, 253, 16888, 12637, 597, 3037, 247, 24291, 32049, 432, 13301, 281, 4329, 592, 436, 3064, 15877, 984, 253, 18890, 8103, 556, 767, 2426, 273, 7285, 861, 285, 359, 476, 1387, 731, 715, 2762, 19040, 2426, 275, 1027, 4088, 6153, 1027, 14493, 50276, 30875, 436, 32827, 5644, 281, 247, 1805, 39762, 3033, 604, 594, 28763, 760, 247, 1127, 670, 253, 39762, 1332, 7091, 407, 247, 5616, 74, 1162, 355, 285, 417, 247, 4858, 8103, 347, 436, 3133, 281, 320, 253, 2022, 7680, 273, 253, 2929, 436, 1127, 3198, 281, 320, 5544, 625, 9257, 285, 275, 625, 2508, 323, 4227, 352, 3133, 4409, 13458, 562, 275, 253, 13358, 1083, 326, 253, 16888, 12637, 1182, 2193, 281, 6642, 285, 253, 24291, 32049, 556, 1182, 1269, 340, 50276, 35640, 272, 436, 310, 247, 6830, 247, 1199, 12150, 4715, 1895, 604, 594, 627, 943, 320, 247, 18511, 5649, 323, 970, 436, 11193, 285, 417, 253, 643, 581, 50276, 249, 6010, 253, 4477, 403, 417, 1663, 2590, 670, 752, 597, 403, 2509, 285, 849, 352, 7033, 281, 18890, 33810, 253, 878, 323, 436, 2173, 4327, 275, 18890, 4764, 2317, 310, 417, 1160, 2590, 4543, 513, 253, 5661, 1543, 4933, 247, 18511, 878, 253, 5661, 1543, 403, 671, 417, 387, 512, 4518, 3559, 390, 5544, 3103, 891, 13414, 1158, 436, 2929, 12310, 253, 3290, 19843, 3236, 414, 390, 8453, 6866, 323, 17857, 77, 5784, 406, 33032, 11183, 923, 5701, 327, 38549, 2708, 50276, 2520, 2929, 9093, 23970, 247, 5203, 6820, 37820, 281, 253, 362, 487, 7792, 11038, 253, 32049, 3268, 273, 581, 10302, 432, 13301, 253, 4477, 921, 1175, 3045, 275, 26647, 824, 326, 616, 2746, 310, 4942, 10237, 275, 247, 1180, 273, 8892, 824, 347, 48960, 5684, 50276, 783, 2934, 891, 1158, 310, 3839, 1175, 533, 627, 403, 2067, 3237, 342, 436, 789, 50276, 7053, 627, 556, 644, 3332, 16424, 275, 15577, 1491, 13418, 806, 1119, 275, 337, 436, 310, 271, 1774, 16018, 432, 253, 7312, 39762, 34754, 908, 275, 362, 487, 368, 878, 281, 7277, 281, 436, 8245, 347, 352, 369, 2011, 326, 352, 41731, 13015, 362, 487, 
275, 247, 2074, 9162, 4836, 347, 3559, 275, 634, 789, 50276, 9815, 2080, 1512, 1199, 2317, 310, 908, 281, 2242, 562, 690, 9648, 5044, 30221, 342, 1675, 281, 15577, 1491, 17697, 15579, 3966, 352, 651, 320, 5322, 323, 1650, 281, 452, 271, 5933, 281, 1056, 253, 4715, 8103, 625, 2590, 4583, 891, 13414, 1928, 253, 2600, 816, 7790, 253, 2978, 50276, 19016, 891, 452, 690, 7350, 670, 253, 8453, 273, 436, 789, 597, 9569, 9093, 247, 5203, 6820, 24291, 32049, 281, 2085, 3530, 323, 253, 27451, 1307, 9403, 1119, 275, 362, 487, 253, 22861, 310, 326, 359, 878, 253, 3673, 44856, 1307, 281, 3157, 26647, 285, 253, 24291, 32049, 1307, 310, 6326, 281, 1978, 253, 6779, 4623, 281, 13301, 581, 812, 452, 908, 271, 2746, 751, 7477, 2509, 1054, 1491, 323, 253, 3673, 44856, 285, 2781, 8692, 323, 253, 13301, 275, 1635, 1199, 789, 556, 644, 2218, 327, 4715, 14237, 326, 39970, 970, 15577, 1491, 46875, 3185, 273, 28699, 374, 495, 577, 608, 2112, 342, 690, 3686, 273, 1307, 281, 3157, 17200, 285, 436, 789, 3133, 281, 11823, 50276, 1439, 320, 6600, 273, 436, 789, 50276, 1189, 455, 891, 812, 923, 690, 2442, 275, 436, 2929, 1146, 3863, 347, 891, 1158, 253, 2746, 310, 24600, 533, 697, 417, 3559, 275, 253, 1463, 3634, 273, 2469, 789, 50276, 18, 1112, 18068, 23248, 891, 2534, 5009, 247, 1218, 39238, 7523, 256, 1960, 6169, 247, 270, 1205, 900, 340, 50276, 73, 75, 37821, 391, 277, 4765, 7477, 15577, 1491, 11454, 13418, 5213, 8059, 323, 5145, 4715, 4765, 374, 305, 6819, 391, 465, 376, 2327, 247, 285, 591, 8440, 268, 20741, 800, 17524, 407, 3963, 1025, 1491, 11903, 1320, 275, 295, 2824, 4267, 495, 30287, 259, 3641, 90, 4611, 246, 18734, 4113, 256, 1111, 2204, 4881, 299, 285, 402, 7311, 90, 2902, 278, 4715, 13358, 14237, 3066, 1491, 46875, 1881, 2321, 16390, 3733, 275, 17857, 1686, 4240, 577, 288, 75, 37821, 391, 277, 10208, 42017, 247, 22874, 80, 9623, 1116, 786, 251, 256, 8899, 267, 465, 492, 17291, 2146, 247, 50276, 67, 1205, 900, 340, 4765, 4715, 3676, 14237, 407, 15577, 1491, 13418, 285, 11903, 1320, 549, 32693, 638, 3845, 549, 32693, 11395, 1438, 2526, 1967, 608, 258, 636, 247, 10510, 3889, 1850, 340, 1370, 248, 632, 285, 47692, 311, 362, 5104, 932, 6779, 4715, 342, 4499, 422, 15970, 12425, 549, 32693, 638, 3845, 549, 32693, 11395, 1967, 1787, 2385, 4765, 187, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 17705, 323, 6779, 4715, 5927, 3309, 1491, 534, 3054, 326, 323, 247, 4836, 2931, 407, 690, 6036, 5912, 3268, 268, 5246, 285, 253, 4736, 273, 323, 1650, 21565, 340, 432, 1269, 247, 6311, 6779, 273, 1269, 17007, 1182, 943, 10517, 253, 13919, 891, 5246, 50276, 895, 91, 50276, 14059, 91, 253, 4477, 840, 12661, 271, 8103, 1159, 253, 17697, 15579, 3673, 44856, 260, 2275, 281, 5416, 326, 247, 6311, 6779, 12310, 253, 5927, 3309, 1491, 17705, 285, 247, 39762, 11193, 281, 253, 17697, 15579, 3673, 44856, 326, 476, 320, 4764, 1025, 970, 3676, 6928, 285, 18325, 342, 2629, 3082, 824, 347, 19191, 11786, 18499, 253, 4477, 671, 14588, 253, 17697, 15579, 3673, 44856, 281, 253, 1491, 3673, 44856, 16653, 23623, 4081, 407, 246, 763, 1615, 4645, 326, 253, 260, 2275, 10140, 281, 253, 1491, 3673, 44856, 342, 50275, 1762, 271, 1774, 7680, 273, 436, 789, 310, 326, 352, 4245, 247, 10527, 22861, 323, 17221, 247, 2173, 1318, 273, 50276, 30786, 685, 5175, 2709, 2193, 4679, 327, 8142, 16192, 382, 921, 326, 275, 5301, 281, 247, 30027, 30410, 285, 281, 39762, 1491, 3673, 44856, 3210, 342, 50276, 249, 209, 2874, 14805, 16987, 253, 260, 2275, 1566, 33526, 1175, 7200, 285, 18543, 310, 12085, 387, 15549, 562, 1171, 35360, 14800, 285, 310, 625, 14264, 281, 3168, 
3364, 48960, 8104, 1529, 3368, 14371, 326, 247, 1566, 10166, 342, 253, 260, 2275, 17705, 310, 7591, 281, 16407, 907, 247, 12421, 13130, 2715, 273, 8142, 16192, 382, 627, 369, 247, 2266, 3064, 273, 4743, 875, 253, 30628, 327, 436, 2929, 581, 37317, 391, 18, 11511, 253, 789, 347, 14916, 253, 4477, 30080, 8659, 436, 1750, 275, 616, 2380, 285, 18520, 285, 391, 18, 4242, 281, 10078, 275, 253, 5955, 594, 253, 913, 7052, 42214, 436, 2278, 253, 643, 767, 30628, 574, 690, 7350, 670, 253, 2929, 954, 273, 534, 497, 9713, 407, 253, 18520, 533, 29325, 1365, 690, 7350, 1335, 3464, 391, 21, 651, 751, 625, 10527, 8132, 263, 275, 253, 2929, 1223, 391, 19, 651, 751, 247, 1480, 5301, 1411, 7477, 285, 260, 5902, 275, 253, 990, 253, 913, 11121, 326, 436, 2929, 3198, 816, 247, 2372, 625, 789, 281, 2953, 841, 7350, 253, 4477, 403, 14659, 281, 49620, 436, 789, 285, 11929, 352, 281, 1529, 5145, 4715, 18767 ]
[raw tokenized fields for this example (input_ids, attention_mask, labels) omitted — numeric token-ID and mask arrays with no human-readable content]