2023-01-20 09:10:15,675 - mmdet - INFO - Environment info:
------------------------------------------------------------
sys.platform: linux
Python: 3.7.3 (default, Jan 22 2021, 20:04:44) [GCC 8.3.0]
CUDA available: True
GPU 0,1,2,3,4,5,6,7: NVIDIA A100 80GB PCIe
CUDA_HOME: /usr/local/cuda
NVCC: Cuda compilation tools, release 11.3, V11.3.109
GCC: x86_64-linux-gnu-gcc (Debian 8.3.0-6) 8.3.0
PyTorch: 1.10.0
PyTorch compiling details: PyTorch built with:
  - GCC 7.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.2.3 (Git Hash 7336ca9f055cf1bfa13efb658fe15dc9b41f0740)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX512
  - CUDA Runtime 11.3
  - NVCC architecture flags: -gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80
  - CuDNN 8.2.1
    - Built with CuDNN 8.2
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.3, CUDNN_VERSION=8.2.0, CXX_COMPILER=/opt/rh/devtoolset-7/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-sign-compare -Wno-unused-parameter -Wno-unused-variable -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format
-Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.10.0, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON
TorchVision: 0.11.1+cu113
OpenCV: 4.6.0
MMCV: 1.6.2
MMCV Compiler: GCC 9.3
MMCV CUDA Compiler: 11.3
MMDetection: 2.26.0+17ad02c
------------------------------------------------------------
2023-01-20 09:10:17,611 - mmdet - INFO - Distributed training: True
2023-01-20 09:10:19,453 - mmdet - INFO - Config:
model = dict(
    type='SelfSupDetector',
    backbone=dict(
        type='SelfSupMaskRCNN',
        backbone=dict(
            type='SwinTransformer',
            embed_dims=128,
            depths=[2, 2, 18, 2],
            num_heads=[4, 8, 16, 32],
            window_size=7,
            mlp_ratio=4,
            qkv_bias=True,
            qk_scale=None,
            drop_rate=0.0,
            attn_drop_rate=0.0,
            drop_path_rate=0.2,
            patch_norm=True,
            out_indices=(0, 1, 2, 3),
            with_cp=False,
            frozen_stages=4,
            convert_weights=True,
            init_cfg=dict(
                type='Pretrained',
                checkpoint='https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22k.pth')),
        neck=dict(
            type='FPN',
            in_channels=[128, 256, 512, 1024],
            out_channels=256,
            num_outs=5),
        rpn_head=dict(
            type='RPNHead',
            in_channels=256,
            feat_channels=256,
            anchor_generator=dict(
                type='AnchorGenerator',
                scales=[8],
                ratios=[0.5, 1.0, 2.0],
                strides=[4, 8, 16, 32, 64]),
            bbox_coder=dict(
                type='DeltaXYWHBBoxCoder',
                target_means=[0.0, 0.0, 0.0, 0.0],
                target_stds=[1.0, 1.0, 1.0, 1.0]),
            loss_cls=dict(type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0),
            loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
        roi_head=dict(
            type='SelfSupStandardRoIHead',
            bbox_roi_extractor=dict(
                type='SingleRoIExtractor',
                roi_layer=dict(type='RoIAlign', output_size=7, sampling_ratio=0),
                out_channels=256,
                featmap_strides=[4, 8, 16, 32]),
            bbox_head=dict(
                type='SelfSupShared4Conv1FCBBoxHead',
                in_channels=256,
                num_classes=256,
                roi_feat_size=7,
                reg_class_agnostic=False,
                loss_bbox=dict(type='L1Loss', loss_weight=1.0),
                loss_cls=dict(type='ContrastiveLoss', loss_weight=1.0, temperature=0.5)),
            mask_roi_extractor=None,
            mask_head=None),
        train_cfg=dict(
            rpn=dict(
                assigner=dict(
                    type='MaxIoUAssigner',
                    pos_iou_thr=0.7,
                    neg_iou_thr=0.3,
                    min_pos_iou=0.3,
                    match_low_quality=True,
                    ignore_iof_thr=-1),
                sampler=dict(
                    type='RandomSampler',
                    num=4096,
                    pos_fraction=1.0,
                    neg_pos_ub=-1,
                    add_gt_as_proposals=False),
                allowed_border=-1,
                pos_weight=-1,
                debug=False),
            rpn_proposal=dict(
                nms_pre=2000,
                max_per_img=1000,
                nms=dict(type='nms', iou_threshold=0.7),
                min_bbox_size=0),
            rcnn=dict(
                assigner=dict(
                    type='MaxIoUAssigner',
                    pos_iou_thr=0.5,
                    neg_iou_thr=0.5,
                    min_pos_iou=0.5,
                    match_low_quality=True,
                    ignore_iof_thr=-1,
                    gt_max_assign_all=False),
                sampler=dict(
                    type='RandomSampler',
                    num=4096,
                    pos_fraction=1,
                    neg_pos_ub=0,
                    add_gt_as_proposals=True),
                mask_size=28,
                pos_weight=-1,
                debug=False)),
        test_cfg=dict(
            rpn=dict(
                nms_pre=1000,
                max_per_img=1000,
                nms=dict(type='nms', iou_threshold=0.7),
                min_bbox_size=0),
            rcnn=dict(
                score_thr=0.05,
                nms=dict(type='nms', iou_threshold=0.5),
                max_per_img=100,
                mask_thr_binary=0.5)),
        init_cfg=dict(
            type='Pretrained',
            checkpoint='pretrain/simmim_swin-b_mmselfsup-pretrain.pth')))
train_dataset_type = 'MultiViewCocoDataset'
test_dataset_type = 'CocoDataset'
data_root = 'data/coco/'
classes = ['selective_search']
img_norm_cfg = dict(
    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
load_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True, with_mask=False)
]
train_pipeline1 = [
    dict(
        type='Resize',
        img_scale=[(1333, 640), (1333, 672), (1333, 704), (1333, 736),
                   (1333, 768), (1333, 800)],
        multiscale_mode='value',
        keep_ratio=True),
    dict(type='FilterAnnotations', min_gt_bbox_wh=(0.01, 0.01)),
    dict(type='Pad', size_divisor=32),
    dict(type='RandFlip', flip_ratio=0.5),
    dict(
        type='OneOf',
        transforms=[
            dict(type='Identity'),
            dict(type='AutoContrast'),
            dict(type='RandEqualize'),
            dict(type='RandSolarize'),
            dict(type='RandColor'),
            dict(type='RandContrast'),
            dict(type='RandBrightness'),
            dict(type='RandSharpness'),
            dict(type='RandPosterize')
        ]),
    dict(
        type='Normalize',
        mean=[123.675, 116.28, 103.53],
        std=[58.395, 57.12, 57.375],
        to_rgb=True),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])
]
train_pipeline2 = [
    dict(
        type='Resize',
        img_scale=[(1333, 640), (1333, 672), (1333, 704), (1333, 736),
                   (1333, 768), (1333, 800)],
        multiscale_mode='value',
        keep_ratio=True),
    dict(type='FilterAnnotations', min_gt_bbox_wh=(0.01, 0.01)),
    dict(type='Pad', size_divisor=32),
    dict(type='RandFlip', flip_ratio=0.5),
    dict(
        type='OneOf',
        transforms=[
            dict(type='Identity'),
            dict(type='AutoContrast'),
            dict(type='RandEqualize'),
            dict(type='RandSolarize'),
            dict(type='RandColor'),
            dict(type='RandContrast'),
            dict(type='RandBrightness'),
            dict(type='RandSharpness'),
            dict(type='RandPosterize')
        ]),
    dict(
        type='Normalize',
        mean=[123.675, 116.28, 103.53],
        std=[58.395, 57.12, 57.375],
        to_rgb=True),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])
]
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='MultiScaleFlipAug',
        img_scale=(1333, 800),
        flip=False,
        transforms=[
            dict(type='Resize', keep_ratio=True),
            dict(type='RandomFlip'),
            dict(
                type='Normalize',
                mean=[123.675, 116.28, 103.53],
                std=[58.395, 57.12, 57.375],
                to_rgb=True),
            dict(type='Pad', size_divisor=32),
            dict(type='ImageToTensor', keys=['img']),
            dict(type='Collect', keys=['img'])
        ])
]
data = dict(
    samples_per_gpu=4,
    workers_per_gpu=2,
    train=dict(
        type='MultiViewCocoDataset',
        dataset=dict(
            type='CocoDataset',
            classes=['selective_search'],
            ann_file='data/coco/filtered_proposals/train2017_ratio3size0008@0.5.json',
            img_prefix='data/coco/train2017/',
            pipeline=[
                dict(type='LoadImageFromFile'),
                dict(type='LoadAnnotations', with_bbox=True, with_mask=False)
            ]),
        num_views=2,
        pipelines=[[{
            'type': 'Resize',
            'img_scale': [(1333, 640), (1333, 672), (1333, 704), (1333, 736),
                          (1333, 768), (1333, 800)],
            'multiscale_mode': 'value',
            'keep_ratio': True
        }, {
            'type': 'FilterAnnotations',
            'min_gt_bbox_wh': (0.01, 0.01)
        }, {
            'type': 'Pad',
            'size_divisor': 32
        }, {
            'type': 'RandFlip',
            'flip_ratio': 0.5
        }, {
            'type': 'OneOf',
            'transforms': [{'type': 'Identity'}, {'type': 'AutoContrast'},
                           {'type': 'RandEqualize'}, {'type': 'RandSolarize'},
                           {'type': 'RandColor'}, {'type': 'RandContrast'},
                           {'type': 'RandBrightness'}, {'type': 'RandSharpness'},
                           {'type': 'RandPosterize'}]
        }, {
            'type': 'Normalize',
            'mean': [123.675, 116.28, 103.53],
            'std': [58.395, 57.12, 57.375],
            'to_rgb': True
        }, {
            'type': 'DefaultFormatBundle'
        }, {
            'type': 'Collect',
            'keys': ['img', 'gt_bboxes', 'gt_labels']
        }], [{
            'type': 'Resize',
            'img_scale': [(1333, 640), (1333, 672), (1333, 704), (1333, 736),
                          (1333, 768), (1333, 800)],
            'multiscale_mode': 'value',
            'keep_ratio': True
        }, {
            'type': 'FilterAnnotations',
            'min_gt_bbox_wh': (0.01, 0.01)
        }, {
            'type': 'Pad',
            'size_divisor': 32
        }, {
            'type': 'RandFlip',
            'flip_ratio': 0.5
        }, {
            'type': 'OneOf',
            'transforms': [{'type': 'Identity'}, {'type': 'AutoContrast'},
                           {'type': 'RandEqualize'}, {'type': 'RandSolarize'},
                           {'type': 'RandColor'}, {'type': 'RandContrast'},
                           {'type': 'RandBrightness'}, {'type': 'RandSharpness'},
                           {'type': 'RandPosterize'}]
        }, {
            'type': 'Normalize',
            'mean': [123.675, 116.28, 103.53],
            'std': [58.395, 57.12, 57.375],
            'to_rgb': True
        }, {
            'type': 'DefaultFormatBundle'
        }, {
            'type': 'Collect',
            'keys': ['img', 'gt_bboxes', 'gt_labels']
        }]]),
    val=dict(
        type='CocoDataset',
        classes=['selective_search'],
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1333, 800),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[123.675, 116.28, 103.53],
                        std=[58.395, 57.12, 57.375],
                        to_rgb=True),
                    dict(type='Pad', size_divisor=32),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]),
    test=dict(
        type='CocoDataset',
        classes=['selective_search'],
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1333, 800),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[123.675, 116.28, 103.53],
                        std=[58.395, 57.12, 57.375],
                        to_rgb=True),
                    dict(type='Pad', size_divisor=32),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]))
evaluation = dict(interval=65535, gpu_collect=True, metric='bbox')
optimizer = dict(
    type='AdamW',
    lr=6e-05,
    betas=(0.9, 0.999),
    weight_decay=0.05,
    paramwise_cfg=dict(
        custom_keys=dict(
            absolute_pos_embed=dict(decay_mult=0.0),
            relative_position_bias_table=dict(decay_mult=0.0),
            norm=dict(decay_mult=0.0))))
optimizer_config = dict(grad_clip=None)
lr_config = dict(
    policy='step',
    warmup='linear',
    warmup_iters=1000,
    warmup_ratio=0.001,
    step=[8, 11])
runner = dict(type='EpochBasedRunner', max_epochs=12)
checkpoint_config = dict(interval=1)
log_config = dict(interval=50, hooks=[dict(type='TextLoggerHook')])
custom_hooks = [
    dict(type='MomentumUpdateHook'),
    dict(
        type='MMDetWandbHook',
        init_kwargs=dict(project='I2B', group='pretrain'),
        interval=50,
        num_eval_images=0,
        log_checkpoint=False)
]
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = None
resume_from = None
workflow = [('train', 1)]
opencv_num_threads = 0
mp_start_method = 'fork'
auto_scale_lr = dict(enable=True, base_batch_size=32)
custom_imports = dict(
    imports=[
        'mmselfsup.datasets.pipelines',
        'selfsup.core.hook.momentum_update_hook',
        'selfsup.datasets.pipelines.selfsup_pipelines',
        'selfsup.datasets.pipelines.rand_aug',
        'selfsup.datasets.single_view_coco',
        'selfsup.datasets.multi_view_coco',
        'selfsup.models.losses.contrastive_loss',
        'selfsup.models.dense_heads.fcos_head',
        'selfsup.models.dense_heads.retina_head',
        'selfsup.models.dense_heads.detr_head',
        'selfsup.models.dense_heads.deformable_detr_head',
        'selfsup.models.roi_heads.bbox_heads.convfc_bbox_head',
        'selfsup.models.roi_heads.standard_roi_head',
        'selfsup.models.detectors.selfsup_detector',
        'selfsup.models.detectors.selfsup_fcos',
        'selfsup.models.detectors.selfsup_detr',
        'selfsup.models.detectors.selfsup_deformable_detr',
        'selfsup.models.detectors.selfsup_retinanet',
        'selfsup.models.detectors.selfsup_mask_rcnn',
        'selfsup.core.bbox.assigners.hungarian_assigner',
        'selfsup.core.bbox.assigners.pseudo_hungarian_assigner',
        'selfsup.core.bbox.match_costs.match_cost'
    ],
    allow_failed_imports=False)
pretrained = 'https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22k.pth'
find_unused_parameters = True
work_dir = 'work_dirs/selfsup_mask-rcnn_swin-b_lsj-3x-coco_simmim-pretrain'
auto_resume = False
gpu_ids = range(0, 8)
2023-01-20 09:10:19,453 - mmdet - INFO - Set random seed to 42, deterministic: False
2023-01-20 09:10:21,222 - mmdet - INFO - initialize SelfSupMaskRCNN with init_cfg {'type': 'Pretrained', 'checkpoint': 'pretrain/simmim_swin-b_mmselfsup-pretrain.pth'}
2023-01-20 09:10:21,516 - mmdet - INFO - initialize SelfSupMaskRCNN with init_cfg {'type': 'Pretrained', 'checkpoint': 'pretrain/simmim_swin-b_mmselfsup-pretrain.pth'}
Name of parameter - Initialization information
online_backbone.backbone.patch_embed.projection.weight - torch.Size([128, 3, 4, 4]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.patch_embed.projection.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.patch_embed.norm.weight - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.patch_embed.norm.bias -
torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.0.norm1.weight - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.0.norm1.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([169, 4]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.stages.0.blocks.0.attn.w_msa.qkv.weight - torch.Size([384, 128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.0.attn.w_msa.qkv.bias - torch.Size([384]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.0.attn.w_msa.proj.weight - torch.Size([128, 128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.0.attn.w_msa.proj.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.0.norm2.weight - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.0.norm2.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.0.ffn.layers.0.0.weight - torch.Size([512, 128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.0.ffn.layers.0.0.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.0.ffn.layers.1.weight - torch.Size([128, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.0.ffn.layers.1.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.1.norm1.weight - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.1.norm1.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([169, 4]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.stages.0.blocks.1.attn.w_msa.qkv.weight - torch.Size([384, 128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.1.attn.w_msa.qkv.bias - torch.Size([384]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.1.attn.w_msa.proj.weight - torch.Size([128, 128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.1.attn.w_msa.proj.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.1.norm2.weight - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.1.norm2.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.1.ffn.layers.0.0.weight - torch.Size([512, 128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.1.ffn.layers.0.0.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.1.ffn.layers.1.weight - torch.Size([128, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.blocks.1.ffn.layers.1.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.downsample.norm.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.downsample.norm.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.0.downsample.reduction.weight - torch.Size([256, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.0.norm1.weight - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.0.norm1.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([169, 8]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.stages.1.blocks.0.attn.w_msa.qkv.weight - torch.Size([768, 256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.0.attn.w_msa.qkv.bias - torch.Size([768]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.0.attn.w_msa.proj.weight - torch.Size([256, 256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.0.attn.w_msa.proj.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.0.norm2.weight - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.0.norm2.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.0.ffn.layers.0.0.weight - torch.Size([1024, 256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.0.ffn.layers.0.0.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.0.ffn.layers.1.weight - torch.Size([256, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.0.ffn.layers.1.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.1.norm1.weight - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.1.norm1.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([169, 8]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.stages.1.blocks.1.attn.w_msa.qkv.weight - torch.Size([768, 256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.1.attn.w_msa.qkv.bias - torch.Size([768]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.1.attn.w_msa.proj.weight - torch.Size([256, 256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.1.attn.w_msa.proj.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.1.norm2.weight - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.1.norm2.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.1.ffn.layers.0.0.weight - torch.Size([1024, 256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.1.ffn.layers.0.0.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.1.ffn.layers.1.weight - torch.Size([256, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.blocks.1.ffn.layers.1.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.downsample.norm.weight - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.downsample.norm.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.1.downsample.reduction.weight - torch.Size([512, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.0.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.0.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.stages.2.blocks.0.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.0.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.0.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.0.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.0.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.0.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.0.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.0.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.0.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.0.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.1.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.1.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.stages.2.blocks.1.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.1.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.1.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.1.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.1.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.1.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.1.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.1.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.1.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.1.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.2.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.2.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.2.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.stages.2.blocks.2.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.2.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.2.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.2.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.2.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.2.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.2.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.2.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.2.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.2.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.3.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.3.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.3.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.stages.2.blocks.3.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.3.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.3.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.3.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.3.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.3.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.3.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.3.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.3.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.3.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.4.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.4.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.4.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.stages.2.blocks.4.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.4.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.4.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.4.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.4.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.4.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.4.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.4.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.4.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.4.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.5.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.5.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.5.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.stages.2.blocks.5.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.5.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.5.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.5.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.5.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.5.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.5.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.5.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.5.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.5.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.6.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.6.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.6.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.stages.2.blocks.6.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.6.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.6.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.6.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.6.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.6.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.6.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.6.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.6.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.6.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.2.blocks.7.norm1.weight - torch.Size([512]):
PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.7.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.7.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector online_backbone.backbone.stages.2.blocks.7.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.7.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.7.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.7.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.7.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.7.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.7.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.7.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.7.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.7.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth 
online_backbone.backbone.stages.2.blocks.8.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.8.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.8.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector online_backbone.backbone.stages.2.blocks.8.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.8.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.8.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.8.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.8.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.8.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.8.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.8.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.8.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.8.ffn.layers.1.bias - torch.Size([512]): 
PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.9.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.9.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.9.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector online_backbone.backbone.stages.2.blocks.9.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.9.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.9.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.9.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.9.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.9.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.9.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.9.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.9.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth 
online_backbone.backbone.stages.2.blocks.9.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.10.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.10.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.10.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector online_backbone.backbone.stages.2.blocks.10.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.10.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.10.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.10.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.10.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.10.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.10.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.10.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.10.ffn.layers.1.weight - torch.Size([512, 2048]): 
PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.10.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.11.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.11.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.11.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector online_backbone.backbone.stages.2.blocks.11.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.11.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.11.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.11.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.11.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.11.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.11.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.11.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth 
online_backbone.backbone.stages.2.blocks.11.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.11.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.12.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.12.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.12.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector online_backbone.backbone.stages.2.blocks.12.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.12.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.12.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.12.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.12.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.12.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.12.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.12.ffn.layers.0.0.bias - 
torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.12.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.12.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.13.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.13.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.13.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector online_backbone.backbone.stages.2.blocks.13.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.13.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.13.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.13.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.13.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.13.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.13.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from 
pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.13.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.13.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.13.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.14.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.14.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.14.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector online_backbone.backbone.stages.2.blocks.14.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.14.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.14.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.14.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.14.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.14.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth 
online_backbone.backbone.stages.2.blocks.14.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.14.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.14.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.14.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.15.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.15.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.15.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector online_backbone.backbone.stages.2.blocks.15.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.15.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.15.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.15.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.15.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.15.norm2.bias - 
torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.15.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.15.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.15.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.15.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.16.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.16.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.16.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector online_backbone.backbone.stages.2.blocks.16.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.16.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.16.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.16.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.16.norm2.weight - torch.Size([512]): PretrainedInit: load from 
pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.16.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.16.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.16.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.16.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.16.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.17.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.17.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.17.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector online_backbone.backbone.stages.2.blocks.17.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.17.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.17.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.17.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth 
online_backbone.backbone.stages.2.blocks.17.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.17.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.17.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.17.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.17.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.blocks.17.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.downsample.norm.weight - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.downsample.norm.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.2.downsample.reduction.weight - torch.Size([1024, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.0.norm1.weight - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.0.norm1.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([169, 32]): The value is the same before and after calling `init_weights` of SelfSupDetector online_backbone.backbone.stages.3.blocks.0.attn.w_msa.qkv.weight - torch.Size([3072, 1024]): PretrainedInit: 
load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.0.attn.w_msa.qkv.bias - torch.Size([3072]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.0.attn.w_msa.proj.weight - torch.Size([1024, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.0.attn.w_msa.proj.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.0.norm2.weight - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.0.norm2.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.0.ffn.layers.0.0.weight - torch.Size([4096, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.0.ffn.layers.0.0.bias - torch.Size([4096]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.0.ffn.layers.1.weight - torch.Size([1024, 4096]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.0.ffn.layers.1.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.1.norm1.weight - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.1.norm1.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth online_backbone.backbone.stages.3.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([169, 32]): The value is the same before and after calling `init_weights` of SelfSupDetector 
online_backbone.backbone.stages.3.blocks.1.attn.w_msa.qkv.weight - torch.Size([3072, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.3.blocks.1.attn.w_msa.qkv.bias - torch.Size([3072]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.3.blocks.1.attn.w_msa.proj.weight - torch.Size([1024, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.3.blocks.1.attn.w_msa.proj.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.3.blocks.1.norm2.weight - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.3.blocks.1.norm2.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.3.blocks.1.ffn.layers.0.0.weight - torch.Size([4096, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.3.blocks.1.ffn.layers.0.0.bias - torch.Size([4096]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.3.blocks.1.ffn.layers.1.weight - torch.Size([1024, 4096]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.stages.3.blocks.1.ffn.layers.1.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.norm0.weight - torch.Size([128]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.norm0.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.norm1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.norm1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.backbone.norm3.weight - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.backbone.norm3.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
online_backbone.neck.lateral_convs.0.conv.weight - torch.Size([256, 128, 1, 1]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.lateral_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.lateral_convs.1.conv.weight - torch.Size([256, 256, 1, 1]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.lateral_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.lateral_convs.2.conv.weight - torch.Size([256, 512, 1, 1]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.lateral_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.lateral_convs.3.conv.weight - torch.Size([256, 1024, 1, 1]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.lateral_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.fpn_convs.0.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.fpn_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.fpn_convs.1.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.fpn_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.fpn_convs.2.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.fpn_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.fpn_convs.3.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.neck.fpn_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.rpn_head.rpn_conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.rpn_head.rpn_conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.rpn_head.rpn_cls.weight - torch.Size([3, 256, 1, 1]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.rpn_head.rpn_cls.bias - torch.Size([3]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.rpn_head.rpn_reg.weight - torch.Size([12, 256, 1, 1]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.rpn_head.rpn_reg.bias - torch.Size([12]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.fc_cls.0.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.fc_cls.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.fc_cls.2.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.fc_cls.2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.fc_reg.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.fc_reg.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.shared_convs.0.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.shared_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.shared_convs.1.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.shared_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.shared_convs.2.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.shared_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.shared_convs.3.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.shared_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.shared_fcs.0.weight - torch.Size([1024, 12544]): The value is the same before and after calling `init_weights` of SelfSupDetector
online_backbone.roi_head.bbox_head.shared_fcs.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.patch_embed.projection.weight - torch.Size([128, 3, 4, 4]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.patch_embed.projection.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.patch_embed.norm.weight - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.patch_embed.norm.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.0.norm1.weight - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.0.norm1.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([169, 4]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.0.blocks.0.attn.w_msa.qkv.weight - torch.Size([384, 128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.0.attn.w_msa.qkv.bias - torch.Size([384]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.0.attn.w_msa.proj.weight - torch.Size([128, 128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.0.attn.w_msa.proj.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.0.norm2.weight - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.0.norm2.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.0.ffn.layers.0.0.weight - torch.Size([512, 128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.0.ffn.layers.0.0.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.0.ffn.layers.1.weight - torch.Size([128, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.0.ffn.layers.1.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.1.norm1.weight - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.1.norm1.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([169, 4]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.0.blocks.1.attn.w_msa.qkv.weight - torch.Size([384, 128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.1.attn.w_msa.qkv.bias - torch.Size([384]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.1.attn.w_msa.proj.weight - torch.Size([128, 128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.1.attn.w_msa.proj.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.1.norm2.weight - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.1.norm2.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.1.ffn.layers.0.0.weight - torch.Size([512, 128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.1.ffn.layers.0.0.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.1.ffn.layers.1.weight - torch.Size([128, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.blocks.1.ffn.layers.1.bias - torch.Size([128]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.downsample.norm.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.downsample.norm.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.0.downsample.reduction.weight - torch.Size([256, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.0.norm1.weight - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.0.norm1.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([169, 8]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.1.blocks.0.attn.w_msa.qkv.weight - torch.Size([768, 256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.0.attn.w_msa.qkv.bias - torch.Size([768]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.0.attn.w_msa.proj.weight - torch.Size([256, 256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.0.attn.w_msa.proj.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.0.norm2.weight - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.0.norm2.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.0.ffn.layers.0.0.weight - torch.Size([1024, 256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.0.ffn.layers.0.0.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.0.ffn.layers.1.weight - torch.Size([256, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.0.ffn.layers.1.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.1.norm1.weight - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.1.norm1.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([169, 8]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.1.blocks.1.attn.w_msa.qkv.weight - torch.Size([768, 256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.1.attn.w_msa.qkv.bias - torch.Size([768]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.1.attn.w_msa.proj.weight - torch.Size([256, 256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.1.attn.w_msa.proj.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.1.norm2.weight - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.1.norm2.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.1.ffn.layers.0.0.weight - torch.Size([1024, 256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.1.ffn.layers.0.0.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.1.ffn.layers.1.weight - torch.Size([256, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.blocks.1.ffn.layers.1.bias - torch.Size([256]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.downsample.norm.weight - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.downsample.norm.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.1.downsample.reduction.weight - torch.Size([512, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.0.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.0.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.2.blocks.0.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.0.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.0.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.0.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.0.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.0.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.0.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.0.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.0.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.0.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.1.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.1.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.2.blocks.1.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.1.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.1.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.1.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.1.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.1.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.1.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.1.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.1.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.1.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.2.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.2.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.2.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.2.blocks.2.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.2.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.2.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.2.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.2.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.2.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.2.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.2.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.2.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.2.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.3.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.3.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.3.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.2.blocks.3.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.3.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.3.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.3.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.3.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.3.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.3.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.3.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.3.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.3.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.4.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.4.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.4.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.2.blocks.4.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.4.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.4.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.4.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.4.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.4.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.4.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.4.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.4.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.4.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.5.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.5.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.5.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.2.blocks.5.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.5.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.5.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.5.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.5.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.5.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.5.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.5.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.5.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.5.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.6.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.6.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.6.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.2.blocks.6.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.6.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.6.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.6.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.6.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.6.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.6.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.6.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.6.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.6.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.7.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.7.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.7.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.2.blocks.7.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.7.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.7.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.7.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.7.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.7.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.7.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.7.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.7.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.7.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.8.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.8.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.8.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.2.blocks.8.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.8.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.8.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.8.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.8.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.8.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.8.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.8.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.8.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.8.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.9.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.9.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.9.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.2.blocks.9.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.9.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.9.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.9.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.9.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.9.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.9.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.9.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.9.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.9.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.10.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.10.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.10.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.2.blocks.10.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.10.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.10.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.10.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.10.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.10.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.10.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.10.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.10.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.10.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.11.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.11.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth
target_backbone.backbone.stages.2.blocks.11.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector
target_backbone.backbone.stages.2.blocks.11.attn.w_msa.qkv.weight - torch.Size([1536,
512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.11.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.11.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.11.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.11.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.11.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.11.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.11.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.11.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.11.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.12.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.12.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.12.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector 
target_backbone.backbone.stages.2.blocks.12.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.12.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.12.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.12.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.12.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.12.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.12.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.12.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.12.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.12.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.13.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.13.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.13.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The 
value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.stages.2.blocks.13.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.13.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.13.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.13.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.13.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.13.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.13.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.13.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.13.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.13.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.14.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.14.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth 
target_backbone.backbone.stages.2.blocks.14.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.stages.2.blocks.14.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.14.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.14.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.14.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.14.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.14.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.14.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.14.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.14.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.14.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.15.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.15.norm1.bias - 
torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.15.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.stages.2.blocks.15.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.15.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.15.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.15.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.15.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.15.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.15.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.15.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.15.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.15.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.16.norm1.weight - torch.Size([512]): PretrainedInit: load from 
pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.16.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.16.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.stages.2.blocks.16.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.16.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.16.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.16.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.16.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.16.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.16.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.16.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.16.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.16.ffn.layers.1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth 
target_backbone.backbone.stages.2.blocks.17.norm1.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.17.norm1.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.17.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.stages.2.blocks.17.attn.w_msa.qkv.weight - torch.Size([1536, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.17.attn.w_msa.qkv.bias - torch.Size([1536]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.17.attn.w_msa.proj.weight - torch.Size([512, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.17.attn.w_msa.proj.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.17.norm2.weight - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.17.norm2.bias - torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.17.ffn.layers.0.0.weight - torch.Size([2048, 512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.17.ffn.layers.0.0.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.17.ffn.layers.1.weight - torch.Size([512, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.blocks.17.ffn.layers.1.bias - 
torch.Size([512]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.downsample.norm.weight - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.downsample.norm.bias - torch.Size([2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.2.downsample.reduction.weight - torch.Size([1024, 2048]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.0.norm1.weight - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.0.norm1.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([169, 32]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.stages.3.blocks.0.attn.w_msa.qkv.weight - torch.Size([3072, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.0.attn.w_msa.qkv.bias - torch.Size([3072]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.0.attn.w_msa.proj.weight - torch.Size([1024, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.0.attn.w_msa.proj.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.0.norm2.weight - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.0.norm2.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth 
target_backbone.backbone.stages.3.blocks.0.ffn.layers.0.0.weight - torch.Size([4096, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.0.ffn.layers.0.0.bias - torch.Size([4096]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.0.ffn.layers.1.weight - torch.Size([1024, 4096]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.0.ffn.layers.1.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.1.norm1.weight - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.1.norm1.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([169, 32]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.stages.3.blocks.1.attn.w_msa.qkv.weight - torch.Size([3072, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.1.attn.w_msa.qkv.bias - torch.Size([3072]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.1.attn.w_msa.proj.weight - torch.Size([1024, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.1.attn.w_msa.proj.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.1.norm2.weight - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.1.norm2.bias - torch.Size([1024]): 
PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.1.ffn.layers.0.0.weight - torch.Size([4096, 1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.1.ffn.layers.0.0.bias - torch.Size([4096]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.1.ffn.layers.1.weight - torch.Size([1024, 4096]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.stages.3.blocks.1.ffn.layers.1.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.norm0.weight - torch.Size([128]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.norm0.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.norm1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.norm1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.backbone.norm3.weight - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.backbone.norm3.bias - torch.Size([1024]): PretrainedInit: load from pretrain/simmim_swin-b_mmselfsup-pretrain.pth target_backbone.neck.lateral_convs.0.conv.weight - torch.Size([256, 128, 1, 1]): The value is the same before and after calling `init_weights` of SelfSupDetector 
target_backbone.neck.lateral_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.lateral_convs.1.conv.weight - torch.Size([256, 256, 1, 1]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.lateral_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.lateral_convs.2.conv.weight - torch.Size([256, 512, 1, 1]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.lateral_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.lateral_convs.3.conv.weight - torch.Size([256, 1024, 1, 1]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.lateral_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.fpn_convs.0.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.fpn_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.fpn_convs.1.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.fpn_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.fpn_convs.2.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.fpn_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector 
target_backbone.neck.fpn_convs.3.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.neck.fpn_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.rpn_head.rpn_conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.rpn_head.rpn_conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.rpn_head.rpn_cls.weight - torch.Size([3, 256, 1, 1]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.rpn_head.rpn_cls.bias - torch.Size([3]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.rpn_head.rpn_reg.weight - torch.Size([12, 256, 1, 1]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.rpn_head.rpn_reg.bias - torch.Size([12]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.fc_cls.0.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.fc_cls.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.fc_cls.2.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.fc_cls.2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.fc_reg.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of SelfSupDetector 
target_backbone.roi_head.bbox_head.fc_reg.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.shared_convs.0.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.shared_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.shared_convs.1.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.shared_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.shared_convs.2.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.shared_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.shared_convs.3.conv.weight - torch.Size([256, 256, 3, 3]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.shared_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.shared_fcs.0.weight - torch.Size([1024, 12544]): The value is the same before and after calling `init_weights` of SelfSupDetector target_backbone.roi_head.bbox_head.shared_fcs.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of SelfSupDetector 2023-01-20 09:10:50,354 - mmdet - INFO - Training with 8 GPU(s) with 4 samples per GPU. The total batch size is 32. 
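The dump above reports one of two outcomes per parameter: loaded from the SimMIM checkpoint ("PretrainedInit") or left at its freshly initialized value ("The value is the same before and after calling `init_weights`"), the latter covering head/neck parameters and norm layers that the checkpoint does not contain. A minimal sketch of how such a report can be produced by diffing key sets; the keys and the prefix-stripping convention below are illustrative stand-ins, not the real state dicts:

```python
# Sketch: classify model parameters by whether a pretrained checkpoint
# provides them, mirroring the PretrainedInit report in the log above.
# The key sets and the "target_backbone." prefix handling are assumptions
# for illustration, not taken from the actual checkpoint files.

def classify_params(model_keys, ckpt_keys, prefix="target_backbone."):
    """Map each model parameter key to its initialization source."""
    report = {}
    for key in model_keys:
        # The detector wraps the backbone, so checkpoint keys may lack the prefix.
        bare = key[len(prefix):] if key.startswith(prefix) else key
        if bare in ckpt_keys:
            report[key] = "PretrainedInit"
        else:
            report[key] = "value unchanged by init_weights"
    return report

model_keys = [
    "target_backbone.backbone.stages.3.blocks.0.norm1.weight",  # in checkpoint
    "target_backbone.neck.lateral_convs.0.conv.weight",         # head-only param
]
ckpt_keys = {"backbone.stages.3.blocks.0.norm1.weight"}

for key, source in classify_params(model_keys, ckpt_keys).items():
    print(f"{key}: {source}")
```

With real modules, `model_keys` would come from `model.state_dict().keys()` and `ckpt_keys` from the loaded checkpoint's `state_dict`; PyTorch's `load_state_dict(strict=False)` returns the same information as `missing_keys`/`unexpected_keys`.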
2023-01-20 09:10:50,354 - mmdet - INFO - The batch size matches the base batch size (32); the LR (6e-05) will not be scaled.
2023-01-20 09:10:51,162 - mmdet - INFO - Start running, host: tiger@n176-118-024, work_dir: /home/tiger/code/mmdet/work_dirs/selfsup_mask-rcnn_swin-b_lsj-3x-coco_simmim-pretrain
2023-01-20 09:10:51,162 - mmdet - INFO - Hooks will be executed in the following order:
before_run:
(VERY_HIGH   ) StepLrUpdaterHook
(NORMAL      ) CheckpointHook
(NORMAL      ) MMDetWandbHook
(LOW         ) DistEvalHook
(VERY_LOW    ) TextLoggerHook
--------------------
before_train_epoch:
(VERY_HIGH   ) StepLrUpdaterHook
(NORMAL      ) MMDetWandbHook
(NORMAL      ) DistSamplerSeedHook
(LOW         ) IterTimerHook
(LOW         ) DistEvalHook
(VERY_LOW    ) TextLoggerHook
--------------------
before_train_iter:
(VERY_HIGH   ) StepLrUpdaterHook
(NORMAL      ) MomentumUpdateHook
(LOW         ) IterTimerHook
(LOW         ) DistEvalHook
--------------------
after_train_iter:
(ABOVE_NORMAL) OptimizerHook
(NORMAL      ) CheckpointHook
(NORMAL      ) MomentumUpdateHook
(NORMAL      ) MMDetWandbHook
(LOW         ) IterTimerHook
(LOW         ) DistEvalHook
(VERY_LOW    ) TextLoggerHook
--------------------
after_train_epoch:
(NORMAL      ) CheckpointHook
(NORMAL      ) MMDetWandbHook
(LOW         ) DistEvalHook
(VERY_LOW    ) TextLoggerHook
--------------------
before_val_epoch:
(NORMAL      ) MMDetWandbHook
(NORMAL      ) DistSamplerSeedHook
(LOW         ) IterTimerHook
(VERY_LOW    ) TextLoggerHook
--------------------
before_val_iter:
(LOW         ) IterTimerHook
--------------------
after_val_iter:
(LOW         ) IterTimerHook
--------------------
after_val_epoch:
(NORMAL      ) MMDetWandbHook
(VERY_LOW    ) TextLoggerHook
--------------------
after_run:
(NORMAL      ) MMDetWandbHook
(VERY_LOW    ) TextLoggerHook
--------------------
2023-01-20 09:10:51,163 - mmdet - INFO - workflow: [('train', 1)], max: 12 epochs
2023-01-20 09:10:51,163 - mmdet - INFO - Checkpoints will be saved to /home/tiger/code/mmdet/work_dirs/selfsup_mask-rcnn_swin-b_lsj-3x-coco_simmim-pretrain by HardDiskBackend.
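Two learning-rate mechanisms are visible here: the linear scaling rule (scale the base LR by total batch size over the base batch size; a no-op in this run since 32 == 32), and a linear warmup ramp in the per-iteration records that follow, where lr climbs from ~3e-06 to the base 6e-05 around iteration 1000. A sketch of both, with the caveat that the warmup hyperparameters (warmup_iters=1000, warmup_ratio=0.001, MMCV-style linear warmup) and the assumption that the record at "Epoch [1][50/...]" reports the LR set at iteration 49 are inferred from the logged values, not shown in this excerpt:

```python
# Sketch of the LR handling seen in this log, under the stated assumptions.

def scale_lr(base_lr, batch_size, base_batch_size=32):
    """Linear scaling rule: LR grows proportionally with total batch size."""
    return base_lr * batch_size / base_batch_size

def linear_warmup_lr(base_lr, cur_iter, warmup_iters=1000, warmup_ratio=0.001):
    """MMCV-style linear warmup: ramp from base_lr*warmup_ratio up to base_lr."""
    k = (1 - cur_iter / warmup_iters) * (1 - warmup_ratio)
    return base_lr * (1 - k)

# Batch size 32 matches the base, so the LR stays 6e-05, as the log states.
assert scale_lr(6e-05, 32) == 6e-05

# Under the inferred settings this reproduces the first logged value.
print(f"{linear_warmup_lr(6e-05, 49):.3e}")  # prints 2.997e-06
```

With a batch size of 64 the same rule would double the LR to 1.2e-04; after warmup, the StepLrUpdaterHook listed above holds the LR at 6e-05 until its decay steps.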
2023-01-20 09:11:44,867 - mmdet - INFO - Epoch [1][50/3696] lr: 2.997e-06, eta: 12:25:46, time: 1.010, data_time: 0.071, memory: 5804, loss_rpn_cls: 0.3055, loss_rpn_bbox: 0.0909, loss_cls: 6.3495, loss_bbox: 2.9983, loss: 9.7443
2023-01-20 09:12:29,287 - mmdet - INFO - Epoch [1][100/3696] lr: 5.994e-06, eta: 11:40:37, time: 0.890, data_time: 0.018, memory: 5804, loss_rpn_cls: 0.1489, loss_rpn_bbox: 0.0629, loss_cls: 6.3722, loss_bbox: 2.7391, loss: 9.3231
2023-01-20 09:13:13,971 - mmdet - INFO - Epoch [1][150/3696] lr: 8.991e-06, eta: 11:26:00, time: 0.894, data_time: 0.016, memory: 5804, loss_rpn_cls: 0.1145, loss_rpn_bbox: 0.0546, loss_cls: 6.5859, loss_bbox: 3.0951, loss: 9.8501
2023-01-20 09:13:59,150 - mmdet - INFO - Epoch [1][200/3696] lr: 1.199e-05, eta: 11:20:09, time: 0.904, data_time: 0.016, memory: 5804, loss_rpn_cls: 0.1078, loss_rpn_bbox: 0.0509, loss_cls: 6.6727, loss_bbox: 3.2260, loss: 10.0574
2023-01-20 09:14:43,513 - mmdet - INFO - Epoch [1][250/3696] lr: 1.499e-05, eta: 11:13:55, time: 0.887, data_time: 0.017, memory: 5804, loss_rpn_cls: 0.1040, loss_rpn_bbox: 0.0468, loss_cls: 6.6959, loss_bbox: 3.2490, loss: 10.0957
2023-01-20 09:15:27,757 - mmdet - INFO - Epoch [1][300/3696] lr: 1.798e-05, eta: 11:09:15, time: 0.885, data_time: 0.017, memory: 6225, loss_rpn_cls: 0.1015, loss_rpn_bbox: 0.0439, loss_cls: 6.6974, loss_bbox: 3.2523, loss: 10.0951
2023-01-20 09:16:12,717 - mmdet - INFO - Epoch [1][350/3696] lr: 2.098e-05, eta: 11:07:12, time: 0.899, data_time: 0.017, memory: 7283, loss_rpn_cls: 0.0990, loss_rpn_bbox: 0.0420, loss_cls: 6.7325, loss_bbox: 3.2536, loss: 10.1271
2023-01-20 09:16:57,518 - mmdet - INFO - Epoch [1][400/3696] lr: 2.398e-05, eta: 11:05:11, time: 0.896, data_time: 0.016, memory: 7483, loss_rpn_cls: 0.0953, loss_rpn_bbox: 0.0392, loss_cls: 6.7200, loss_bbox: 3.2664, loss: 10.1209
2023-01-20 09:17:42,563 - mmdet - INFO - Epoch [1][450/3696] lr: 2.697e-05, eta: 11:03:50, time: 0.901, data_time: 0.017, memory: 7483, loss_rpn_cls: 0.0958, loss_rpn_bbox: 0.0390, loss_cls: 6.7265, loss_bbox: 3.2418, loss: 10.1030
2023-01-20 09:18:27,542 - mmdet - INFO - Epoch [1][500/3696] lr: 2.997e-05, eta: 11:02:31, time: 0.900, data_time: 0.017, memory: 7601, loss_rpn_cls: 0.0938, loss_rpn_bbox: 0.0382, loss_cls: 6.7134, loss_bbox: 3.2462, loss: 10.0915
2023-01-20 09:19:12,221 - mmdet - INFO - Epoch [1][550/3696] lr: 3.297e-05, eta: 11:00:55, time: 0.894, data_time: 0.016, memory: 7601, loss_rpn_cls: 0.0913, loss_rpn_bbox: 0.0361, loss_cls: 6.6717, loss_bbox: 3.2317, loss: 10.0308
2023-01-20 09:19:57,243 - mmdet - INFO - Epoch [1][600/3696] lr: 3.596e-05, eta: 10:59:51, time: 0.900, data_time: 0.017, memory: 7601, loss_rpn_cls: 0.0891, loss_rpn_bbox: 0.0350, loss_cls: 6.6427, loss_bbox: 3.2483, loss: 10.0152
2023-01-20 09:20:41,748 - mmdet - INFO - Epoch [1][650/3696] lr: 3.896e-05, eta: 10:58:16, time: 0.890, data_time: 0.017, memory: 7601, loss_rpn_cls: 0.0872, loss_rpn_bbox: 0.0341, loss_cls: 6.6351, loss_bbox: 3.2485, loss: 10.0049
2023-01-20 09:21:26,832 - mmdet - INFO - Epoch [1][700/3696] lr: 4.196e-05, eta: 10:57:25, time: 0.902, data_time: 0.017, memory: 8611, loss_rpn_cls: 0.0867, loss_rpn_bbox: 0.0337, loss_cls: 6.6106, loss_bbox: 3.2263, loss: 9.9573
2023-01-20 09:22:10,900 - mmdet - INFO - Epoch [1][750/3696] lr: 4.496e-05, eta: 10:55:35, time: 0.881, data_time: 0.017, memory: 8611, loss_rpn_cls: 0.0839, loss_rpn_bbox: 0.0320, loss_cls: 6.5669, loss_bbox: 3.2802, loss: 9.9631
2023-01-20 09:22:55,479 - mmdet - INFO - Epoch [1][800/3696] lr: 4.795e-05, eta: 10:54:21, time: 0.892, data_time: 0.017, memory: 8611, loss_rpn_cls: 0.0836, loss_rpn_bbox: 0.0317, loss_cls: 6.5534, loss_bbox: 3.2578, loss: 9.9265
2023-01-20 09:23:40,416 - mmdet - INFO - Epoch [1][850/3696] lr: 5.095e-05, eta: 10:53:29, time: 0.899, data_time: 0.017, memory: 8611, loss_rpn_cls: 0.0839, loss_rpn_bbox: 0.0316, loss_cls: 6.5305, loss_bbox: 3.2712, loss: 9.9172
2023-01-20 09:24:24,521 - mmdet - INFO - Epoch [1][900/3696] lr: 5.395e-05, eta: 10:51:57, time: 0.882, data_time: 0.018, memory: 8611, loss_rpn_cls: 0.0805, loss_rpn_bbox: 0.0306, loss_cls: 6.5124, loss_bbox: 3.2691, loss: 9.8926
2023-01-20 09:25:09,198 - mmdet - INFO - Epoch [1][950/3696] lr: 5.694e-05, eta: 10:50:57, time: 0.894, data_time: 0.017, memory: 8611, loss_rpn_cls: 0.0824, loss_rpn_bbox: 0.0309, loss_cls: 6.5169, loss_bbox: 3.2632, loss: 9.8935
2023-01-20 09:25:53,939 - mmdet - INFO - Exp name: selfsup_mask-rcnn_swin-b_simmim.py
2023-01-20 09:25:53,939 - mmdet - INFO - Epoch [1][1000/3696] lr: 5.994e-05, eta: 10:50:01, time: 0.895, data_time: 0.017, memory: 8611, loss_rpn_cls: 0.0806, loss_rpn_bbox: 0.0303, loss_cls: 6.4921, loss_bbox: 3.2672, loss: 9.8702
2023-01-20 09:26:39,358 - mmdet - INFO - Epoch [1][1050/3696] lr: 6.000e-05, eta: 10:49:34, time: 0.908, data_time: 0.017, memory: 8611, loss_rpn_cls: 0.0806, loss_rpn_bbox: 0.0304, loss_cls: 6.4976, loss_bbox: 3.2589, loss: 9.8675
2023-01-20 09:27:23,806 - mmdet - INFO - Epoch [1][1100/3696] lr: 6.000e-05, eta: 10:48:27, time: 0.889, data_time: 0.017, memory: 8611, loss_rpn_cls: 0.0800, loss_rpn_bbox: 0.0299, loss_cls: 6.4832, loss_bbox: 3.2635, loss: 9.8566
2023-01-20 09:28:08,714 - mmdet - INFO - Epoch [1][1150/3696] lr: 6.000e-05, eta: 10:47:39, time: 0.898, data_time: 0.017, memory: 8611, loss_rpn_cls: 0.0784, loss_rpn_bbox: 0.0291, loss_cls: 6.4529, loss_bbox: 3.2843, loss: 9.8447
2023-01-20 09:28:54,224 - mmdet - INFO - Epoch [1][1200/3696] lr: 6.000e-05, eta: 10:47:14, time: 0.910, data_time: 0.018, memory: 8611, loss_rpn_cls: 0.0783, loss_rpn_bbox: 0.0294, loss_cls: 6.4572, loss_bbox: 3.2652, loss: 9.8302
2023-01-20 09:29:38,745 - mmdet - INFO - Epoch [1][1250/3696] lr: 6.000e-05, eta: 10:46:12, time: 0.890, data_time: 0.017, memory: 8668, loss_rpn_cls: 0.0771, loss_rpn_bbox: 0.0285, loss_cls: 6.4241, loss_bbox: 3.2958, loss: 9.8256
2023-01-20 09:30:24,089 - mmdet - INFO - Epoch [1][1300/3696] lr: 6.000e-05, eta: 10:45:39, time: 0.907,
data_time: 0.016, memory: 8668, loss_rpn_cls: 0.0770, loss_rpn_bbox: 0.0293, loss_cls: 6.4485, loss_bbox: 3.2595, loss: 9.8144 2023-01-20 09:31:08,643 - mmdet - INFO - Epoch [1][1350/3696] lr: 6.000e-05, eta: 10:44:40, time: 0.891, data_time: 0.017, memory: 8771, loss_rpn_cls: 0.0768, loss_rpn_bbox: 0.0285, loss_cls: 6.4079, loss_bbox: 3.2862, loss: 9.7993 2023-01-20 09:31:53,242 - mmdet - INFO - Epoch [1][1400/3696] lr: 6.000e-05, eta: 10:43:44, time: 0.892, data_time: 0.017, memory: 8771, loss_rpn_cls: 0.0760, loss_rpn_bbox: 0.0284, loss_cls: 6.4148, loss_bbox: 3.2837, loss: 9.8030 2023-01-20 09:32:38,073 - mmdet - INFO - Epoch [1][1450/3696] lr: 6.000e-05, eta: 10:42:55, time: 0.897, data_time: 0.017, memory: 8771, loss_rpn_cls: 0.0764, loss_rpn_bbox: 0.0284, loss_cls: 6.4213, loss_bbox: 3.2747, loss: 9.8007 2023-01-20 09:33:22,696 - mmdet - INFO - Epoch [1][1500/3696] lr: 6.000e-05, eta: 10:42:01, time: 0.892, data_time: 0.017, memory: 8771, loss_rpn_cls: 0.0762, loss_rpn_bbox: 0.0281, loss_cls: 6.3980, loss_bbox: 3.2743, loss: 9.7766 2023-01-20 09:34:07,905 - mmdet - INFO - Epoch [1][1550/3696] lr: 6.000e-05, eta: 10:41:23, time: 0.904, data_time: 0.017, memory: 8771, loss_rpn_cls: 0.0756, loss_rpn_bbox: 0.0282, loss_cls: 6.3914, loss_bbox: 3.2647, loss: 9.7599 2023-01-20 09:34:52,778 - mmdet - INFO - Epoch [1][1600/3696] lr: 6.000e-05, eta: 10:40:36, time: 0.897, data_time: 0.017, memory: 8771, loss_rpn_cls: 0.0748, loss_rpn_bbox: 0.0276, loss_cls: 6.3818, loss_bbox: 3.2654, loss: 9.7495 2023-01-20 09:35:37,709 - mmdet - INFO - Epoch [1][1650/3696] lr: 6.000e-05, eta: 10:39:50, time: 0.899, data_time: 0.017, memory: 8771, loss_rpn_cls: 0.0768, loss_rpn_bbox: 0.0283, loss_cls: 6.3842, loss_bbox: 3.2570, loss: 9.7463 2023-01-20 09:36:22,029 - mmdet - INFO - Epoch [1][1700/3696] lr: 6.000e-05, eta: 10:38:49, time: 0.886, data_time: 0.017, memory: 8771, loss_rpn_cls: 0.0750, loss_rpn_bbox: 0.0277, loss_cls: 6.3670, loss_bbox: 3.2639, loss: 9.7337 2023-01-20 
09:37:07,082 - mmdet - INFO - Epoch [1][1750/3696] lr: 6.000e-05, eta: 10:38:07, time: 0.901, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0745, loss_rpn_bbox: 0.0278, loss_cls: 6.3807, loss_bbox: 3.2383, loss: 9.7214 2023-01-20 09:37:51,900 - mmdet - INFO - Epoch [1][1800/3696] lr: 6.000e-05, eta: 10:37:20, time: 0.896, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0733, loss_rpn_bbox: 0.0272, loss_cls: 6.3625, loss_bbox: 3.2557, loss: 9.7187 2023-01-20 09:38:36,594 - mmdet - INFO - Epoch [1][1850/3696] lr: 6.000e-05, eta: 10:36:29, time: 0.894, data_time: 0.016, memory: 8917, loss_rpn_cls: 0.0735, loss_rpn_bbox: 0.0277, loss_cls: 6.3829, loss_bbox: 3.2336, loss: 9.7176 2023-01-20 09:39:21,613 - mmdet - INFO - Epoch [1][1900/3696] lr: 6.000e-05, eta: 10:35:46, time: 0.901, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0733, loss_rpn_bbox: 0.0271, loss_cls: 6.3714, loss_bbox: 3.2297, loss: 9.7015 2023-01-20 09:40:06,322 - mmdet - INFO - Epoch [1][1950/3696] lr: 6.000e-05, eta: 10:34:57, time: 0.894, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0722, loss_rpn_bbox: 0.0268, loss_cls: 6.3410, loss_bbox: 3.2306, loss: 9.6707 2023-01-20 09:40:50,389 - mmdet - INFO - Exp name: selfsup_mask-rcnn_swin-b_simmim.py 2023-01-20 09:40:50,390 - mmdet - INFO - Epoch [1][2000/3696] lr: 6.000e-05, eta: 10:33:53, time: 0.881, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0720, loss_rpn_bbox: 0.0266, loss_cls: 6.3375, loss_bbox: 3.2500, loss: 9.6861 2023-01-20 09:41:35,176 - mmdet - INFO - Epoch [1][2050/3696] lr: 6.000e-05, eta: 10:33:06, time: 0.896, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0731, loss_rpn_bbox: 0.0271, loss_cls: 6.3493, loss_bbox: 3.2227, loss: 9.6722 2023-01-20 09:42:20,395 - mmdet - INFO - Epoch [1][2100/3696] lr: 6.000e-05, eta: 10:32:28, time: 0.904, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0744, loss_rpn_bbox: 0.0279, loss_cls: 6.3592, loss_bbox: 3.1851, loss: 9.6466 2023-01-20 09:43:05,198 - mmdet - INFO - Epoch [1][2150/3696] lr: 
6.000e-05, eta: 10:31:41, time: 0.896, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0723, loss_rpn_bbox: 0.0271, loss_cls: 6.3442, loss_bbox: 3.2105, loss: 9.6541 2023-01-20 09:43:50,154 - mmdet - INFO - Epoch [1][2200/3696] lr: 6.000e-05, eta: 10:30:57, time: 0.899, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0713, loss_rpn_bbox: 0.0264, loss_cls: 6.3388, loss_bbox: 3.1981, loss: 9.6347 2023-01-20 09:44:34,956 - mmdet - INFO - Epoch [1][2250/3696] lr: 6.000e-05, eta: 10:30:10, time: 0.896, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0710, loss_rpn_bbox: 0.0262, loss_cls: 6.3316, loss_bbox: 3.2003, loss: 9.6291 2023-01-20 09:45:19,282 - mmdet - INFO - Epoch [1][2300/3696] lr: 6.000e-05, eta: 10:29:15, time: 0.887, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0717, loss_rpn_bbox: 0.0264, loss_cls: 6.3170, loss_bbox: 3.2026, loss: 9.6177 2023-01-20 09:46:04,084 - mmdet - INFO - Epoch [1][2350/3696] lr: 6.000e-05, eta: 10:28:28, time: 0.896, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0720, loss_rpn_bbox: 0.0268, loss_cls: 6.3334, loss_bbox: 3.1738, loss: 9.6060 2023-01-20 09:46:49,251 - mmdet - INFO - Epoch [1][2400/3696] lr: 6.000e-05, eta: 10:27:48, time: 0.903, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0724, loss_rpn_bbox: 0.0270, loss_cls: 6.3282, loss_bbox: 3.1498, loss: 9.5773 2023-01-20 09:47:33,815 - mmdet - INFO - Epoch [1][2450/3696] lr: 6.000e-05, eta: 10:26:57, time: 0.891, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0718, loss_rpn_bbox: 0.0266, loss_cls: 6.3354, loss_bbox: 3.1779, loss: 9.6118 2023-01-20 09:48:18,943 - mmdet - INFO - Epoch [1][2500/3696] lr: 6.000e-05, eta: 10:26:17, time: 0.903, data_time: 0.017, memory: 8917, loss_rpn_cls: 0.0713, loss_rpn_bbox: 0.0264, loss_cls: 6.3257, loss_bbox: 3.1631, loss: 9.5864 2023-01-20 09:49:04,552 - mmdet - INFO - Epoch [1][2550/3696] lr: 6.000e-05, eta: 10:25:43, time: 0.912, data_time: 0.017, memory: 9619, loss_rpn_cls: 0.0719, loss_rpn_bbox: 0.0267, loss_cls: 6.3189, 
loss_bbox: 3.1513, loss: 9.5688 2023-01-20 09:49:50,314 - mmdet - INFO - Epoch [1][2600/3696] lr: 6.000e-05, eta: 10:25:12, time: 0.915, data_time: 0.017, memory: 9619, loss_rpn_cls: 0.0729, loss_rpn_bbox: 0.0271, loss_cls: 6.3360, loss_bbox: 3.1421, loss: 9.5781 2023-01-20 09:50:35,413 - mmdet - INFO - Epoch [1][2650/3696] lr: 6.000e-05, eta: 10:24:30, time: 0.902, data_time: 0.017, memory: 9619, loss_rpn_cls: 0.0707, loss_rpn_bbox: 0.0262, loss_cls: 6.3142, loss_bbox: 3.1647, loss: 9.5758 2023-01-20 09:51:20,209 - mmdet - INFO - Epoch [1][2700/3696] lr: 6.000e-05, eta: 10:23:43, time: 0.896, data_time: 0.017, memory: 9619, loss_rpn_cls: 0.0713, loss_rpn_bbox: 0.0262, loss_cls: 6.3188, loss_bbox: 3.1438, loss: 9.5602 2023-01-20 09:52:04,942 - mmdet - INFO - Epoch [1][2750/3696] lr: 6.000e-05, eta: 10:22:55, time: 0.895, data_time: 0.016, memory: 9619, loss_rpn_cls: 0.0722, loss_rpn_bbox: 0.0268, loss_cls: 6.3379, loss_bbox: 3.1195, loss: 9.5564 2023-01-20 09:52:50,170 - mmdet - INFO - Epoch [1][2800/3696] lr: 6.000e-05, eta: 10:22:15, time: 0.905, data_time: 0.016, memory: 9921, loss_rpn_cls: 0.0716, loss_rpn_bbox: 0.0266, loss_cls: 6.3235, loss_bbox: 3.1235, loss: 9.5452 2023-01-20 09:53:35,157 - mmdet - INFO - Epoch [1][2850/3696] lr: 6.000e-05, eta: 10:21:31, time: 0.900, data_time: 0.017, memory: 9921, loss_rpn_cls: 0.0694, loss_rpn_bbox: 0.0259, loss_cls: 6.3129, loss_bbox: 3.1442, loss: 9.5523 2023-01-20 09:54:20,618 - mmdet - INFO - Epoch [1][2900/3696] lr: 6.000e-05, eta: 10:20:54, time: 0.909, data_time: 0.017, memory: 9921, loss_rpn_cls: 0.0723, loss_rpn_bbox: 0.0272, loss_cls: 6.3327, loss_bbox: 3.1098, loss: 9.5420 2023-01-20 09:55:04,925 - mmdet - INFO - Epoch [1][2950/3696] lr: 6.000e-05, eta: 10:20:00, time: 0.886, data_time: 0.017, memory: 9921, loss_rpn_cls: 0.0699, loss_rpn_bbox: 0.0255, loss_cls: 6.2911, loss_bbox: 3.1393, loss: 9.5258 2023-01-20 09:55:50,368 - mmdet - INFO - Exp name: selfsup_mask-rcnn_swin-b_simmim.py 2023-01-20 09:55:50,369 - 
mmdet - INFO - Epoch [1][3000/3696] lr: 6.000e-05, eta: 10:19:22, time: 0.909, data_time: 0.016, memory: 9921, loss_rpn_cls: 0.0700, loss_rpn_bbox: 0.0260, loss_cls: 6.3010, loss_bbox: 3.1194, loss: 9.5164 2023-01-20 09:56:36,055 - mmdet - INFO - Epoch [1][3050/3696] lr: 6.000e-05, eta: 10:18:47, time: 0.914, data_time: 0.017, memory: 9921, loss_rpn_cls: 0.0690, loss_rpn_bbox: 0.0259, loss_cls: 6.2963, loss_bbox: 3.1104, loss: 9.5016 2023-01-20 09:57:21,410 - mmdet - INFO - Epoch [1][3100/3696] lr: 6.000e-05, eta: 10:18:08, time: 0.907, data_time: 0.017, memory: 9921, loss_rpn_cls: 0.0712, loss_rpn_bbox: 0.0267, loss_cls: 6.3241, loss_bbox: 3.1131, loss: 9.5350 2023-01-20 09:58:06,301 - mmdet - INFO - Epoch [1][3150/3696] lr: 6.000e-05, eta: 10:17:22, time: 0.898, data_time: 0.016, memory: 9921, loss_rpn_cls: 0.0696, loss_rpn_bbox: 0.0258, loss_cls: 6.3020, loss_bbox: 3.1410, loss: 9.5384 2023-01-20 09:58:51,031 - mmdet - INFO - Epoch [1][3200/3696] lr: 6.000e-05, eta: 10:16:34, time: 0.895, data_time: 0.016, memory: 9921, loss_rpn_cls: 0.0692, loss_rpn_bbox: 0.0256, loss_cls: 6.2827, loss_bbox: 3.1143, loss: 9.4918 2023-01-20 09:59:35,417 - mmdet - INFO - Epoch [1][3250/3696] lr: 6.000e-05, eta: 10:15:42, time: 0.888, data_time: 0.017, memory: 9921, loss_rpn_cls: 0.0683, loss_rpn_bbox: 0.0252, loss_cls: 6.2841, loss_bbox: 3.1339, loss: 9.5116 2023-01-20 10:00:20,125 - mmdet - INFO - Epoch [1][3300/3696] lr: 6.000e-05, eta: 10:14:54, time: 0.894, data_time: 0.016, memory: 9921, loss_rpn_cls: 0.0689, loss_rpn_bbox: 0.0252, loss_cls: 6.2843, loss_bbox: 3.1360, loss: 9.5144 2023-01-20 10:01:05,260 - mmdet - INFO - Epoch [1][3350/3696] lr: 6.000e-05, eta: 10:14:12, time: 0.903, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0692, loss_rpn_bbox: 0.0256, loss_cls: 6.2909, loss_bbox: 3.1130, loss: 9.4987 2023-01-20 10:01:49,969 - mmdet - INFO - Epoch [1][3400/3696] lr: 6.000e-05, eta: 10:13:24, time: 0.894, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0690, 
loss_rpn_bbox: 0.0257, loss_cls: 6.2882, loss_bbox: 3.1023, loss: 9.4852 2023-01-20 10:02:35,132 - mmdet - INFO - Epoch [1][3450/3696] lr: 6.000e-05, eta: 10:12:42, time: 0.903, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0701, loss_rpn_bbox: 0.0265, loss_cls: 6.3271, loss_bbox: 3.1052, loss: 9.5289 2023-01-20 10:03:19,709 - mmdet - INFO - Epoch [1][3500/3696] lr: 6.000e-05, eta: 10:11:53, time: 0.892, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0683, loss_rpn_bbox: 0.0254, loss_cls: 6.2836, loss_bbox: 3.1299, loss: 9.5072 2023-01-20 10:04:05,044 - mmdet - INFO - Epoch [1][3550/3696] lr: 6.000e-05, eta: 10:11:12, time: 0.907, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0687, loss_rpn_bbox: 0.0255, loss_cls: 6.2916, loss_bbox: 3.1065, loss: 9.4923 2023-01-20 10:04:50,701 - mmdet - INFO - Epoch [1][3600/3696] lr: 6.000e-05, eta: 10:10:35, time: 0.913, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0693, loss_rpn_bbox: 0.0259, loss_cls: 6.3023, loss_bbox: 3.0950, loss: 9.4924 2023-01-20 10:05:36,697 - mmdet - INFO - Epoch [1][3650/3696] lr: 6.000e-05, eta: 10:10:02, time: 0.920, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0698, loss_rpn_bbox: 0.0262, loss_cls: 6.3052, loss_bbox: 3.0922, loss: 9.4934 2023-01-20 10:06:18,433 - mmdet - INFO - Saving checkpoint at 1 epochs 2023-01-20 10:07:09,387 - mmdet - INFO - Epoch [2][50/3696] lr: 6.000e-05, eta: 10:01:40, time: 0.960, data_time: 0.076, memory: 10680, loss_rpn_cls: 0.0688, loss_rpn_bbox: 0.0255, loss_cls: 6.2872, loss_bbox: 3.1086, loss: 9.4900 2023-01-20 10:07:53,907 - mmdet - INFO - Epoch [2][100/3696] lr: 6.000e-05, eta: 10:00:56, time: 0.890, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0667, loss_rpn_bbox: 0.0246, loss_cls: 6.2717, loss_bbox: 3.1128, loss: 9.4759 2023-01-20 10:08:38,629 - mmdet - INFO - Epoch [2][150/3696] lr: 6.000e-05, eta: 10:00:15, time: 0.894, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0690, loss_rpn_bbox: 0.0260, loss_cls: 6.3156, loss_bbox: 3.1068, loss: 
9.5173 2023-01-20 10:09:24,137 - mmdet - INFO - Epoch [2][200/3696] lr: 6.000e-05, eta: 9:59:41, time: 0.910, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0688, loss_rpn_bbox: 0.0261, loss_cls: 6.3108, loss_bbox: 3.0838, loss: 9.4894 2023-01-20 10:10:09,192 - mmdet - INFO - Epoch [2][250/3696] lr: 6.000e-05, eta: 9:59:03, time: 0.901, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0687, loss_rpn_bbox: 0.0257, loss_cls: 6.2911, loss_bbox: 3.0875, loss: 9.4731 2023-01-20 10:10:53,791 - mmdet - INFO - Epoch [2][300/3696] lr: 6.000e-05, eta: 9:58:19, time: 0.892, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0681, loss_rpn_bbox: 0.0254, loss_cls: 6.2820, loss_bbox: 3.1077, loss: 9.4832 2023-01-20 10:11:38,776 - mmdet - INFO - Epoch [2][350/3696] lr: 6.000e-05, eta: 9:57:40, time: 0.900, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0689, loss_rpn_bbox: 0.0259, loss_cls: 6.3019, loss_bbox: 3.0923, loss: 9.4890 2023-01-20 10:12:28,418 - mmdet - INFO - Epoch [2][400/3696] lr: 6.000e-05, eta: 9:57:46, time: 0.993, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0690, loss_rpn_bbox: 0.0257, loss_cls: 6.2941, loss_bbox: 3.0877, loss: 9.4766 2023-01-20 10:13:24,648 - mmdet - INFO - Epoch [2][450/3696] lr: 6.000e-05, eta: 9:58:55, time: 1.125, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0702, loss_rpn_bbox: 0.0264, loss_cls: 6.3131, loss_bbox: 3.0716, loss: 9.4813 2023-01-20 10:14:09,236 - mmdet - INFO - Epoch [2][500/3696] lr: 6.000e-05, eta: 9:58:09, time: 0.892, data_time: 0.017, memory: 10680, loss_rpn_cls: 0.0685, loss_rpn_bbox: 0.0255, loss_cls: 6.2814, loss_bbox: 3.0971, loss: 9.4725 2023-01-20 10:14:56,995 - mmdet - INFO - Epoch [2][550/3696] lr: 6.000e-05, eta: 9:57:54, time: 0.955, data_time: 0.018, memory: 10680, loss_rpn_cls: 0.0689, loss_rpn_bbox: 0.0260, loss_cls: 6.3141, loss_bbox: 3.0839, loss: 9.4928 2023-01-20 10:15:43,207 - mmdet - INFO - Epoch [2][600/3696] lr: 6.000e-05, eta: 9:57:23, time: 0.924, data_time: 0.018, memory: 11338, 
loss_rpn_cls: 0.0689, loss_rpn_bbox: 0.0257, loss_cls: 6.2886, loss_bbox: 3.0804, loss: 9.4637 2023-01-20 10:16:28,260 - mmdet - INFO - Epoch [2][650/3696] lr: 6.000e-05, eta: 9:56:41, time: 0.901, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0669, loss_rpn_bbox: 0.0250, loss_cls: 6.2810, loss_bbox: 3.1064, loss: 9.4793 2023-01-20 10:17:13,326 - mmdet - INFO - Epoch [2][700/3696] lr: 6.000e-05, eta: 9:55:59, time: 0.901, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0671, loss_rpn_bbox: 0.0251, loss_cls: 6.2863, loss_bbox: 3.0927, loss: 9.4712 2023-01-20 10:17:58,046 - mmdet - INFO - Epoch [2][750/3696] lr: 6.000e-05, eta: 9:55:14, time: 0.894, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0671, loss_rpn_bbox: 0.0253, loss_cls: 6.2807, loss_bbox: 3.1044, loss: 9.4774 2023-01-20 10:18:43,595 - mmdet - INFO - Epoch [2][800/3696] lr: 6.000e-05, eta: 9:54:36, time: 0.911, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0676, loss_rpn_bbox: 0.0254, loss_cls: 6.2793, loss_bbox: 3.0807, loss: 9.4530 2023-01-20 10:19:28,342 - mmdet - INFO - Epoch [2][850/3696] lr: 6.000e-05, eta: 9:53:52, time: 0.895, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0668, loss_rpn_bbox: 0.0253, loss_cls: 6.2829, loss_bbox: 3.0993, loss: 9.4743 2023-01-20 10:20:13,743 - mmdet - INFO - Epoch [2][900/3696] lr: 6.000e-05, eta: 9:53:12, time: 0.908, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0673, loss_rpn_bbox: 0.0254, loss_cls: 6.2924, loss_bbox: 3.0883, loss: 9.4734 2023-01-20 10:20:58,809 - mmdet - INFO - Epoch [2][950/3696] lr: 6.000e-05, eta: 9:52:30, time: 0.901, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0678, loss_rpn_bbox: 0.0256, loss_cls: 6.2976, loss_bbox: 3.0787, loss: 9.4697 2023-01-20 10:21:43,489 - mmdet - INFO - Epoch [2][1000/3696] lr: 6.000e-05, eta: 9:51:45, time: 0.894, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0668, loss_rpn_bbox: 0.0249, loss_cls: 6.2827, loss_bbox: 3.0997, loss: 9.4741 2023-01-20 10:22:28,980 - mmdet - INFO - Epoch 
[2][1050/3696] lr: 6.000e-05, eta: 9:51:06, time: 0.910, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0674, loss_rpn_bbox: 0.0254, loss_cls: 6.2902, loss_bbox: 3.0767, loss: 9.4597 2023-01-20 10:23:14,153 - mmdet - INFO - Epoch [2][1100/3696] lr: 6.000e-05, eta: 9:50:24, time: 0.903, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0665, loss_rpn_bbox: 0.0250, loss_cls: 6.2838, loss_bbox: 3.0818, loss: 9.4571 2023-01-20 10:23:59,364 - mmdet - INFO - Epoch [2][1150/3696] lr: 6.000e-05, eta: 9:49:43, time: 0.904, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0665, loss_rpn_bbox: 0.0250, loss_cls: 6.2874, loss_bbox: 3.0838, loss: 9.4627 2023-01-20 10:24:43,845 - mmdet - INFO - Epoch [2][1200/3696] lr: 6.000e-05, eta: 9:48:56, time: 0.890, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0662, loss_rpn_bbox: 0.0247, loss_cls: 6.2721, loss_bbox: 3.0991, loss: 9.4620 2023-01-20 10:25:28,480 - mmdet - INFO - Epoch [2][1250/3696] lr: 6.000e-05, eta: 9:48:10, time: 0.893, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0664, loss_rpn_bbox: 0.0247, loss_cls: 6.2767, loss_bbox: 3.0964, loss: 9.4643 2023-01-20 10:26:13,074 - mmdet - INFO - Epoch [2][1300/3696] lr: 6.000e-05, eta: 9:47:24, time: 0.892, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0657, loss_rpn_bbox: 0.0250, loss_cls: 6.2942, loss_bbox: 3.1028, loss: 9.4878 2023-01-20 10:26:58,118 - mmdet - INFO - Epoch [2][1350/3696] lr: 6.000e-05, eta: 9:46:41, time: 0.901, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0666, loss_rpn_bbox: 0.0248, loss_cls: 6.2745, loss_bbox: 3.0890, loss: 9.4549 2023-01-20 10:27:42,621 - mmdet - INFO - Epoch [2][1400/3696] lr: 6.000e-05, eta: 9:45:54, time: 0.890, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0656, loss_rpn_bbox: 0.0247, loss_cls: 6.2717, loss_bbox: 3.0951, loss: 9.4572 2023-01-20 10:28:27,131 - mmdet - INFO - Epoch [2][1450/3696] lr: 6.000e-05, eta: 9:45:07, time: 0.890, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0674, loss_rpn_bbox: 0.0251, 
loss_cls: 6.2979, loss_bbox: 3.0838, loss: 9.4742 2023-01-20 10:29:12,976 - mmdet - INFO - Epoch [2][1500/3696] lr: 6.000e-05, eta: 9:44:31, time: 0.917, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0673, loss_rpn_bbox: 0.0256, loss_cls: 6.2954, loss_bbox: 3.0603, loss: 9.4487 2023-01-20 10:29:58,980 - mmdet - INFO - Epoch [2][1550/3696] lr: 6.000e-05, eta: 9:43:55, time: 0.920, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0675, loss_rpn_bbox: 0.0257, loss_cls: 6.2990, loss_bbox: 3.0584, loss: 9.4506 2023-01-20 10:30:44,212 - mmdet - INFO - Epoch [2][1600/3696] lr: 6.000e-05, eta: 9:43:13, time: 0.905, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0669, loss_rpn_bbox: 0.0254, loss_cls: 6.2912, loss_bbox: 3.0745, loss: 9.4580 2023-01-20 10:31:29,724 - mmdet - INFO - Epoch [2][1650/3696] lr: 6.000e-05, eta: 9:42:34, time: 0.910, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0668, loss_rpn_bbox: 0.0253, loss_cls: 6.2832, loss_bbox: 3.0662, loss: 9.4416 2023-01-20 10:32:15,481 - mmdet - INFO - Epoch [2][1700/3696] lr: 6.000e-05, eta: 9:41:56, time: 0.915, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0676, loss_rpn_bbox: 0.0252, loss_cls: 6.2817, loss_bbox: 3.0658, loss: 9.4404 2023-01-20 10:33:00,830 - mmdet - INFO - Epoch [2][1750/3696] lr: 6.000e-05, eta: 9:41:15, time: 0.907, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0672, loss_rpn_bbox: 0.0252, loss_cls: 6.2910, loss_bbox: 3.0617, loss: 9.4452 2023-01-20 10:33:46,495 - mmdet - INFO - Epoch [2][1800/3696] lr: 6.000e-05, eta: 9:40:36, time: 0.913, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0674, loss_rpn_bbox: 0.0255, loss_cls: 6.2903, loss_bbox: 3.0460, loss: 9.4292 2023-01-20 10:34:31,144 - mmdet - INFO - Epoch [2][1850/3696] lr: 6.000e-05, eta: 9:39:50, time: 0.893, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0661, loss_rpn_bbox: 0.0247, loss_cls: 6.2749, loss_bbox: 3.0877, loss: 9.4535 2023-01-20 10:35:16,140 - mmdet - INFO - Epoch [2][1900/3696] lr: 6.000e-05, eta: 9:39:06, 
time: 0.900, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0662, loss_rpn_bbox: 0.0249, loss_cls: 6.2853, loss_bbox: 3.0650, loss: 9.4414 2023-01-20 10:36:01,398 - mmdet - INFO - Epoch [2][1950/3696] lr: 6.000e-05, eta: 9:38:24, time: 0.905, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0670, loss_rpn_bbox: 0.0255, loss_cls: 6.2928, loss_bbox: 3.0694, loss: 9.4546 2023-01-20 10:36:47,003 - mmdet - INFO - Epoch [2][2000/3696] lr: 6.000e-05, eta: 9:37:45, time: 0.912, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0665, loss_rpn_bbox: 0.0251, loss_cls: 6.2781, loss_bbox: 3.0639, loss: 9.4335 2023-01-20 10:37:32,450 - mmdet - INFO - Epoch [2][2050/3696] lr: 6.000e-05, eta: 9:37:04, time: 0.909, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0660, loss_rpn_bbox: 0.0253, loss_cls: 6.2924, loss_bbox: 3.0649, loss: 9.4486 2023-01-20 10:38:17,311 - mmdet - INFO - Epoch [2][2100/3696] lr: 6.000e-05, eta: 9:36:19, time: 0.897, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0659, loss_rpn_bbox: 0.0249, loss_cls: 6.2816, loss_bbox: 3.0746, loss: 9.4470 2023-01-20 10:39:02,513 - mmdet - INFO - Epoch [2][2150/3696] lr: 6.000e-05, eta: 9:35:37, time: 0.904, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0653, loss_rpn_bbox: 0.0251, loss_cls: 6.2751, loss_bbox: 3.0747, loss: 9.4402 2023-01-20 10:39:47,880 - mmdet - INFO - Epoch [2][2200/3696] lr: 6.000e-05, eta: 9:34:55, time: 0.907, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0661, loss_rpn_bbox: 0.0251, loss_cls: 6.2901, loss_bbox: 3.0546, loss: 9.4359 2023-01-20 10:40:32,805 - mmdet - INFO - Epoch [2][2250/3696] lr: 6.000e-05, eta: 9:34:11, time: 0.899, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0670, loss_rpn_bbox: 0.0250, loss_cls: 6.2812, loss_bbox: 3.0720, loss: 9.4452 2023-01-20 10:41:17,447 - mmdet - INFO - Epoch [2][2300/3696] lr: 6.000e-05, eta: 9:33:25, time: 0.893, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0646, loss_rpn_bbox: 0.0246, loss_cls: 6.2840, loss_bbox: 3.0862, loss: 9.4595 
2023-01-20 10:42:03,270 - mmdet - INFO - Epoch [2][2350/3696] lr: 6.000e-05, eta: 9:32:46, time: 0.916, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0662, loss_rpn_bbox: 0.0253, loss_cls: 6.2959, loss_bbox: 3.0488, loss: 9.4362 2023-01-20 10:42:48,766 - mmdet - INFO - Epoch [2][2400/3696] lr: 6.000e-05, eta: 9:32:05, time: 0.910, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0666, loss_rpn_bbox: 0.0252, loss_cls: 6.2831, loss_bbox: 3.0535, loss: 9.4284 2023-01-20 10:43:33,971 - mmdet - INFO - Epoch [2][2450/3696] lr: 6.000e-05, eta: 9:31:23, time: 0.904, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0650, loss_rpn_bbox: 0.0249, loss_cls: 6.2854, loss_bbox: 3.0727, loss: 9.4480 2023-01-20 10:44:18,588 - mmdet - INFO - Epoch [2][2500/3696] lr: 6.000e-05, eta: 9:30:36, time: 0.892, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0662, loss_rpn_bbox: 0.0249, loss_cls: 6.2829, loss_bbox: 3.0714, loss: 9.4453 2023-01-20 10:45:03,502 - mmdet - INFO - Epoch [2][2550/3696] lr: 6.000e-05, eta: 9:29:52, time: 0.898, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0651, loss_rpn_bbox: 0.0250, loss_cls: 6.2940, loss_bbox: 3.0660, loss: 9.4502 2023-01-20 10:45:48,373 - mmdet - INFO - Epoch [2][2600/3696] lr: 6.000e-05, eta: 9:29:07, time: 0.897, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0657, loss_rpn_bbox: 0.0247, loss_cls: 6.2790, loss_bbox: 3.0723, loss: 9.4417 2023-01-20 10:46:34,252 - mmdet - INFO - Epoch [2][2650/3696] lr: 6.000e-05, eta: 9:28:28, time: 0.918, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0663, loss_rpn_bbox: 0.0251, loss_cls: 6.2832, loss_bbox: 3.0582, loss: 9.4328 2023-01-20 10:47:18,896 - mmdet - INFO - Epoch [2][2700/3696] lr: 6.000e-05, eta: 9:27:42, time: 0.893, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0648, loss_rpn_bbox: 0.0248, loss_cls: 6.2832, loss_bbox: 3.0741, loss: 9.4468 2023-01-20 10:48:04,493 - mmdet - INFO - Epoch [2][2750/3696] lr: 6.000e-05, eta: 9:27:01, time: 0.912, data_time: 0.017, memory: 11338, 
loss_rpn_cls: 0.0663, loss_rpn_bbox: 0.0254, loss_cls: 6.2931, loss_bbox: 3.0553, loss: 9.4401 2023-01-20 10:48:49,772 - mmdet - INFO - Epoch [2][2800/3696] lr: 6.000e-05, eta: 9:26:19, time: 0.906, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0654, loss_rpn_bbox: 0.0251, loss_cls: 6.2869, loss_bbox: 3.0521, loss: 9.4294 2023-01-20 10:49:34,452 - mmdet - INFO - Epoch [2][2850/3696] lr: 6.000e-05, eta: 9:25:33, time: 0.894, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0655, loss_rpn_bbox: 0.0248, loss_cls: 6.2866, loss_bbox: 3.0630, loss: 9.4398 2023-01-20 10:50:19,583 - mmdet - INFO - Epoch [2][2900/3696] lr: 6.000e-05, eta: 9:24:49, time: 0.903, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0658, loss_rpn_bbox: 0.0251, loss_cls: 6.2882, loss_bbox: 3.0589, loss: 9.4380 2023-01-20 10:51:04,412 - mmdet - INFO - Epoch [2][2950/3696] lr: 6.000e-05, eta: 9:24:04, time: 0.897, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0651, loss_rpn_bbox: 0.0248, loss_cls: 6.2781, loss_bbox: 3.0658, loss: 9.4338 2023-01-20 10:51:49,661 - mmdet - INFO - Epoch [2][3000/3696] lr: 6.000e-05, eta: 9:23:21, time: 0.905, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0655, loss_rpn_bbox: 0.0250, loss_cls: 6.2795, loss_bbox: 3.0577, loss: 9.4276 2023-01-20 10:52:34,828 - mmdet - INFO - Epoch [2][3050/3696] lr: 6.000e-05, eta: 9:22:38, time: 0.903, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0653, loss_rpn_bbox: 0.0248, loss_cls: 6.2813, loss_bbox: 3.0504, loss: 9.4218 2023-01-20 10:53:19,923 - mmdet - INFO - Epoch [2][3100/3696] lr: 6.000e-05, eta: 9:21:54, time: 0.902, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0657, loss_rpn_bbox: 0.0250, loss_cls: 6.2872, loss_bbox: 3.0564, loss: 9.4343 2023-01-20 10:54:05,231 - mmdet - INFO - Epoch [2][3150/3696] lr: 6.000e-05, eta: 9:21:12, time: 0.906, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0653, loss_rpn_bbox: 0.0251, loss_cls: 6.2931, loss_bbox: 3.0507, loss: 9.4342 2023-01-20 10:54:50,214 - mmdet - INFO - Epoch 
[2][3200/3696] lr: 6.000e-05, eta: 9:20:27, time: 0.900, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0646, loss_rpn_bbox: 0.0245, loss_cls: 6.2759, loss_bbox: 3.0729, loss: 9.4380
2023-01-20 10:55:34,963 - mmdet - INFO - Epoch [2][3250/3696] lr: 6.000e-05, eta: 9:19:42, time: 0.895, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0634, loss_rpn_bbox: 0.0241, loss_cls: 6.2671, loss_bbox: 3.0784, loss: 9.4331
2023-01-20 10:56:20,165 - mmdet - INFO - Epoch [2][3300/3696] lr: 6.000e-05, eta: 9:18:58, time: 0.904, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0655, loss_rpn_bbox: 0.0252, loss_cls: 6.2870, loss_bbox: 3.0296, loss: 9.4073
2023-01-20 10:57:04,855 - mmdet - INFO - Epoch [2][3350/3696] lr: 6.000e-05, eta: 9:18:12, time: 0.894, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0651, loss_rpn_bbox: 0.0247, loss_cls: 6.2810, loss_bbox: 3.0528, loss: 9.4235
2023-01-20 10:57:50,086 - mmdet - INFO - Epoch [2][3400/3696] lr: 6.000e-05, eta: 9:17:29, time: 0.905, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0660, loss_rpn_bbox: 0.0251, loss_cls: 6.2797, loss_bbox: 3.0509, loss: 9.4217
2023-01-20 10:58:34,886 - mmdet - INFO - Epoch [2][3450/3696] lr: 6.000e-05, eta: 9:16:44, time: 0.896, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0662, loss_rpn_bbox: 0.0253, loss_cls: 6.3020, loss_bbox: 3.0625, loss: 9.4561
2023-01-20 10:59:19,601 - mmdet - INFO - Epoch [2][3500/3696] lr: 6.000e-05, eta: 9:15:58, time: 0.894, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0637, loss_rpn_bbox: 0.0242, loss_cls: 6.2729, loss_bbox: 3.0752, loss: 9.4361
2023-01-20 11:00:04,631 - mmdet - INFO - Epoch [2][3550/3696] lr: 6.000e-05, eta: 9:15:14, time: 0.901, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0633, loss_rpn_bbox: 0.0240, loss_cls: 6.2492, loss_bbox: 3.0746, loss: 9.4111
2023-01-20 11:00:49,882 - mmdet - INFO - Epoch [2][3600/3696] lr: 6.000e-05, eta: 9:14:31, time: 0.905, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0648, loss_rpn_bbox: 0.0249, loss_cls: 6.2788, loss_bbox: 3.0358, loss: 9.4042
2023-01-20 11:01:35,385 - mmdet - INFO - Epoch [2][3650/3696] lr: 6.000e-05, eta: 9:13:49, time: 0.910, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0656, loss_rpn_bbox: 0.0252, loss_cls: 6.2868, loss_bbox: 3.0403, loss: 9.4179
2023-01-20 11:02:16,632 - mmdet - INFO - Saving checkpoint at 2 epochs
2023-01-20 11:03:07,766 - mmdet - INFO - Epoch [3][50/3696] lr: 6.000e-05, eta: 9:09:13, time: 0.960, data_time: 0.073, memory: 11338, loss_rpn_cls: 0.0667, loss_rpn_bbox: 0.0256, loss_cls: 6.2949, loss_bbox: 3.0381, loss: 9.4252
2023-01-20 11:03:52,710 - mmdet - INFO - Epoch [3][100/3696] lr: 6.000e-05, eta: 9:08:30, time: 0.899, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0635, loss_rpn_bbox: 0.0243, loss_cls: 6.2681, loss_bbox: 3.0604, loss: 9.4163
2023-01-20 11:04:37,604 - mmdet - INFO - Epoch [3][150/3696] lr: 6.000e-05, eta: 9:07:47, time: 0.898, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0638, loss_rpn_bbox: 0.0245, loss_cls: 6.2823, loss_bbox: 3.0544, loss: 9.4250
2023-01-20 11:05:22,861 - mmdet - INFO - Epoch [3][200/3696] lr: 6.000e-05, eta: 9:07:05, time: 0.905, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0641, loss_rpn_bbox: 0.0249, loss_cls: 6.2849, loss_bbox: 3.0537, loss: 9.4276
2023-01-20 11:06:07,881 - mmdet - INFO - Epoch [3][250/3696] lr: 6.000e-05, eta: 9:06:22, time: 0.900, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0644, loss_rpn_bbox: 0.0249, loss_cls: 6.2844, loss_bbox: 3.0460, loss: 9.4196
2023-01-20 11:06:53,324 - mmdet - INFO - Epoch [3][300/3696] lr: 6.000e-05, eta: 9:05:41, time: 0.909, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0642, loss_rpn_bbox: 0.0247, loss_cls: 6.2826, loss_bbox: 3.0435, loss: 9.4151
2023-01-20 11:07:38,848 - mmdet - INFO - Epoch [3][350/3696] lr: 6.000e-05, eta: 9:05:01, time: 0.911, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0645, loss_rpn_bbox: 0.0247, loss_cls: 6.2778, loss_bbox: 3.0435, loss: 9.4105
2023-01-20 11:08:24,984 - mmdet - INFO - Epoch [3][400/3696] lr: 6.000e-05, eta: 9:04:23, time: 0.923, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0647, loss_rpn_bbox: 0.0252, loss_cls: 6.2949, loss_bbox: 3.0130, loss: 9.3978
2023-01-20 11:09:09,745 - mmdet - INFO - Epoch [3][450/3696] lr: 6.000e-05, eta: 9:03:39, time: 0.895, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0638, loss_rpn_bbox: 0.0246, loss_cls: 6.2815, loss_bbox: 3.0503, loss: 9.4202
2023-01-20 11:09:54,732 - mmdet - INFO - Epoch [3][500/3696] lr: 6.000e-05, eta: 9:02:56, time: 0.900, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0636, loss_rpn_bbox: 0.0245, loss_cls: 6.2755, loss_bbox: 3.0574, loss: 9.4210
2023-01-20 11:10:40,082 - mmdet - INFO - Epoch [3][550/3696] lr: 6.000e-05, eta: 9:02:14, time: 0.907, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0643, loss_rpn_bbox: 0.0249, loss_cls: 6.2945, loss_bbox: 3.0430, loss: 9.4267
2023-01-20 11:11:24,904 - mmdet - INFO - Epoch [3][600/3696] lr: 6.000e-05, eta: 9:01:30, time: 0.896, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0631, loss_rpn_bbox: 0.0242, loss_cls: 6.2705, loss_bbox: 3.0568, loss: 9.4146
2023-01-20 11:12:09,975 - mmdet - INFO - Epoch [3][650/3696] lr: 6.000e-05, eta: 9:00:47, time: 0.901, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0643, loss_rpn_bbox: 0.0253, loss_cls: 6.3077, loss_bbox: 3.0438, loss: 9.4412
2023-01-20 11:12:54,964 - mmdet - INFO - Epoch [3][700/3696] lr: 6.000e-05, eta: 9:00:04, time: 0.900, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0638, loss_rpn_bbox: 0.0244, loss_cls: 6.2736, loss_bbox: 3.0621, loss: 9.4239
2023-01-20 11:13:39,847 - mmdet - INFO - Epoch [3][750/3696] lr: 6.000e-05, eta: 8:59:20, time: 0.897, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0628, loss_rpn_bbox: 0.0241, loss_cls: 6.2624, loss_bbox: 3.0528, loss: 9.4021
2023-01-20 11:14:25,440 - mmdet - INFO - Epoch [3][800/3696] lr: 6.000e-05, eta: 8:58:39, time: 0.912, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0641, loss_rpn_bbox: 0.0247, loss_cls: 6.2754, loss_bbox: 3.0285, loss: 9.3927
2023-01-20 11:15:10,374 - mmdet - INFO - Epoch [3][850/3696] lr: 6.000e-05, eta: 8:57:56, time: 0.899, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0637, loss_rpn_bbox: 0.0241, loss_cls: 6.2574, loss_bbox: 3.0539, loss: 9.3992
2023-01-20 11:15:55,360 - mmdet - INFO - Epoch [3][900/3696] lr: 6.000e-05, eta: 8:57:12, time: 0.900, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0637, loss_rpn_bbox: 0.0248, loss_cls: 6.2775, loss_bbox: 3.0524, loss: 9.4184
2023-01-20 11:16:40,616 - mmdet - INFO - Epoch [3][950/3696] lr: 6.000e-05, eta: 8:56:30, time: 0.905, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0639, loss_rpn_bbox: 0.0250, loss_cls: 6.2872, loss_bbox: 3.0438, loss: 9.4199
2023-01-20 11:17:25,732 - mmdet - INFO - Epoch [3][1000/3696] lr: 6.000e-05, eta: 8:55:47, time: 0.902, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0646, loss_rpn_bbox: 0.0248, loss_cls: 6.2764, loss_bbox: 3.0407, loss: 9.4065
2023-01-20 11:18:10,966 - mmdet - INFO - Epoch [3][1050/3696] lr: 6.000e-05, eta: 8:55:05, time: 0.905, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0638, loss_rpn_bbox: 0.0245, loss_cls: 6.2810, loss_bbox: 3.0438, loss: 9.4132
2023-01-20 11:18:56,070 - mmdet - INFO - Epoch [3][1100/3696] lr: 6.000e-05, eta: 8:54:22, time: 0.902, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0646, loss_rpn_bbox: 0.0247, loss_cls: 6.2827, loss_bbox: 3.0398, loss: 9.4119
2023-01-20 11:19:41,260 - mmdet - INFO - Epoch [3][1150/3696] lr: 6.000e-05, eta: 8:53:39, time: 0.904, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0648, loss_rpn_bbox: 0.0251, loss_cls: 6.2903, loss_bbox: 3.0356, loss: 9.4158
2023-01-20 11:20:26,286 - mmdet - INFO - Epoch [3][1200/3696] lr: 6.000e-05, eta: 8:52:56, time: 0.901, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0629, loss_rpn_bbox: 0.0243, loss_cls: 6.2769, loss_bbox: 3.0488, loss: 9.4128
2023-01-20 11:21:11,610 - mmdet - INFO - Epoch [3][1250/3696] lr: 6.000e-05, eta: 8:52:14, time: 0.906, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0637, loss_rpn_bbox: 0.0245, loss_cls: 6.2773, loss_bbox: 3.0505, loss: 9.4161
2023-01-20 11:21:56,296 - mmdet - INFO - Epoch [3][1300/3696] lr: 6.000e-05, eta: 8:51:29, time: 0.894, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0637, loss_rpn_bbox: 0.0247, loss_cls: 6.2801, loss_bbox: 3.0381, loss: 9.4067
2023-01-20 11:22:41,795 - mmdet - INFO - Epoch [3][1350/3696] lr: 6.000e-05, eta: 8:50:47, time: 0.910, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0648, loss_rpn_bbox: 0.0252, loss_cls: 6.2912, loss_bbox: 3.0265, loss: 9.4077
2023-01-20 11:23:26,809 - mmdet - INFO - Epoch [3][1400/3696] lr: 6.000e-05, eta: 8:50:04, time: 0.900, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0647, loss_rpn_bbox: 0.0250, loss_cls: 6.2913, loss_bbox: 3.0413, loss: 9.4223
2023-01-20 11:24:11,796 - mmdet - INFO - Epoch [3][1450/3696] lr: 6.000e-05, eta: 8:49:20, time: 0.900, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0640, loss_rpn_bbox: 0.0249, loss_cls: 6.2896, loss_bbox: 3.0328, loss: 9.4112
2023-01-20 11:24:57,026 - mmdet - INFO - Epoch [3][1500/3696] lr: 6.000e-05, eta: 8:48:37, time: 0.905, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0639, loss_rpn_bbox: 0.0249, loss_cls: 6.2916, loss_bbox: 3.0337, loss: 9.4141
2023-01-20 11:25:42,031 - mmdet - INFO - Epoch [3][1550/3696] lr: 6.000e-05, eta: 8:47:54, time: 0.900, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0629, loss_rpn_bbox: 0.0242, loss_cls: 6.2685, loss_bbox: 3.0330, loss: 9.3885
2023-01-20 11:26:27,239 - mmdet - INFO - Epoch [3][1600/3696] lr: 6.000e-05, eta: 8:47:11, time: 0.904, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0630, loss_rpn_bbox: 0.0245, loss_cls: 6.2921, loss_bbox: 3.0501, loss: 9.4298
2023-01-20 11:27:12,793 - mmdet - INFO - Epoch [3][1650/3696] lr: 6.000e-05, eta: 8:46:29, time: 0.911, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0641, loss_rpn_bbox: 0.0247, loss_cls: 6.2770, loss_bbox: 3.0227, loss: 9.3884
2023-01-20 11:27:57,300 - mmdet - INFO - Epoch [3][1700/3696] lr: 6.000e-05, eta: 8:45:44, time: 0.890, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0642, loss_rpn_bbox: 0.0250, loss_cls: 6.2896, loss_bbox: 3.0294, loss: 9.4082
2023-01-20 11:28:42,130 - mmdet - INFO - Epoch [3][1750/3696] lr: 6.000e-05, eta: 8:44:59, time: 0.897, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0641, loss_rpn_bbox: 0.0245, loss_cls: 6.2827, loss_bbox: 3.0445, loss: 9.4159
2023-01-20 11:29:26,953 - mmdet - INFO - Epoch [3][1800/3696] lr: 6.000e-05, eta: 8:44:15, time: 0.896, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0624, loss_rpn_bbox: 0.0241, loss_cls: 6.2704, loss_bbox: 3.0560, loss: 9.4129
2023-01-20 11:30:11,875 - mmdet - INFO - Epoch [3][1850/3696] lr: 6.000e-05, eta: 8:43:31, time: 0.898, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0631, loss_rpn_bbox: 0.0242, loss_cls: 6.2768, loss_bbox: 3.0458, loss: 9.4100
2023-01-20 11:30:56,878 - mmdet - INFO - Epoch [3][1900/3696] lr: 6.000e-05, eta: 8:42:47, time: 0.900, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0639, loss_rpn_bbox: 0.0246, loss_cls: 6.2809, loss_bbox: 3.0328, loss: 9.4022
2023-01-20 11:31:41,744 - mmdet - INFO - Epoch [3][1950/3696] lr: 6.000e-05, eta: 8:42:03, time: 0.898, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0621, loss_rpn_bbox: 0.0241, loss_cls: 6.2735, loss_bbox: 3.0480, loss: 9.4076
2023-01-20 11:32:27,212 - mmdet - INFO - Epoch [3][2000/3696] lr: 6.000e-05, eta: 8:41:21, time: 0.909, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0645, loss_rpn_bbox: 0.0250, loss_cls: 6.2801, loss_bbox: 3.0277, loss: 9.3972
2023-01-20 11:33:11,664 - mmdet - INFO - Epoch [3][2050/3696] lr: 6.000e-05, eta: 8:40:35, time: 0.889, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0626, loss_rpn_bbox: 0.0239, loss_cls: 6.2558, loss_bbox: 3.0429, loss: 9.3852
2023-01-20 11:33:57,355 - mmdet - INFO - Epoch [3][2100/3696] lr: 6.000e-05, eta: 8:39:54, time: 0.914, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0644, loss_rpn_bbox: 0.0254, loss_cls: 6.2968, loss_bbox: 3.0160, loss: 9.4025
2023-01-20 11:34:43,553 - mmdet - INFO - Epoch [3][2150/3696] lr: 6.000e-05, eta: 8:39:15, time: 0.924, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0634, loss_rpn_bbox: 0.0249, loss_cls: 6.2804, loss_bbox: 3.0120, loss: 9.3808
2023-01-20 11:35:29,008 - mmdet - INFO - Epoch [3][2200/3696] lr: 6.000e-05, eta: 8:38:32, time: 0.909, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0630, loss_rpn_bbox: 0.0247, loss_cls: 6.2823, loss_bbox: 3.0244, loss: 9.3944
2023-01-20 11:36:14,337 - mmdet - INFO - Epoch [3][2250/3696] lr: 6.000e-05, eta: 8:37:50, time: 0.907, data_time: 0.018, memory: 11338, loss_rpn_cls: 0.0638, loss_rpn_bbox: 0.0248, loss_cls: 6.2849, loss_bbox: 3.0283, loss: 9.4018
2023-01-20 11:36:59,797 - mmdet - INFO - Epoch [3][2300/3696] lr: 6.000e-05, eta: 8:37:07, time: 0.909, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0630, loss_rpn_bbox: 0.0246, loss_cls: 6.2672, loss_bbox: 3.0316, loss: 9.3863
2023-01-20 11:37:45,033 - mmdet - INFO - Epoch [3][2350/3696] lr: 6.000e-05, eta: 8:36:24, time: 0.905, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0629, loss_rpn_bbox: 0.0244, loss_cls: 6.2732, loss_bbox: 3.0381, loss: 9.3985
2023-01-20 11:38:29,953 - mmdet - INFO - Epoch [3][2400/3696] lr: 6.000e-05, eta: 8:35:40, time: 0.898, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0630, loss_rpn_bbox: 0.0243, loss_cls: 6.2653, loss_bbox: 3.0317, loss: 9.3844
2023-01-20 11:39:15,478 - mmdet - INFO - Epoch [3][2450/3696] lr: 6.000e-05, eta: 8:34:58, time: 0.911, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0637, loss_rpn_bbox: 0.0245, loss_cls: 6.2690, loss_bbox: 3.0183, loss: 9.3755
2023-01-20 11:39:59,682 - mmdet - INFO - Epoch [3][2500/3696] lr: 6.000e-05, eta: 8:34:11, time: 0.884, data_time: 0.017, memory: 11338, loss_rpn_cls: 0.0617, loss_rpn_bbox: 0.0240, loss_cls: 6.2679, loss_bbox: 3.0561, loss: 9.4096
2023-01-20 11:40:44,756 - mmdet - INFO - Epoch [3][2550/3696] lr: 6.000e-05, eta: 8:33:28, time: 0.901, data_time: 0.017, memory: 11969, loss_rpn_cls: 0.0627, loss_rpn_bbox: 0.0245, loss_cls: 6.2739, loss_bbox: 3.0335, loss: 9.3946
2023-01-20 11:41:30,266 - mmdet - INFO - Epoch [3][2600/3696] lr: 6.000e-05, eta: 8:32:45, time: 0.910, data_time: 0.018, memory: 11969, loss_rpn_cls: 0.0635, loss_rpn_bbox: 0.0246, loss_cls: 6.2758, loss_bbox: 3.0294, loss: 9.3933
2023-01-20 11:42:15,115 - mmdet - INFO - Epoch [3][2650/3696] lr: 6.000e-05, eta: 8:32:01, time: 0.897, data_time: 0.017, memory: 11969, loss_rpn_cls: 0.0618, loss_rpn_bbox: 0.0239, loss_cls: 6.2660, loss_bbox: 3.0428, loss: 9.3945
2023-01-20 11:42:59,818 - mmdet - INFO - Epoch [3][2700/3696] lr: 6.000e-05, eta: 8:31:16, time: 0.894, data_time: 0.018, memory: 11969, loss_rpn_cls: 0.0620, loss_rpn_bbox: 0.0242, loss_cls: 6.2753, loss_bbox: 3.0491, loss: 9.4106
2023-01-20 11:43:45,242 - mmdet - INFO - Epoch [3][2750/3696] lr: 6.000e-05, eta: 8:30:33, time: 0.908, data_time: 0.017, memory: 11969, loss_rpn_cls: 0.0623, loss_rpn_bbox: 0.0241, loss_cls: 6.2686, loss_bbox: 3.0294, loss: 9.3844
2023-01-20 11:44:30,559 - mmdet - INFO - Epoch [3][2800/3696] lr: 6.000e-05, eta: 8:29:50, time: 0.906, data_time: 0.017, memory: 11969, loss_rpn_cls: 0.0636, loss_rpn_bbox: 0.0250, loss_cls: 6.2880, loss_bbox: 3.0301, loss: 9.4067
2023-01-20 11:45:15,577 - mmdet - INFO - Epoch [3][2850/3696] lr: 6.000e-05, eta: 8:29:06, time: 0.900, data_time: 0.018, memory: 11969, loss_rpn_cls: 0.0621, loss_rpn_bbox: 0.0243, loss_cls: 6.2763, loss_bbox: 3.0275, loss: 9.3901
2023-01-20 11:46:00,564 - mmdet - INFO - Epoch [3][2900/3696] lr: 6.000e-05, eta: 8:28:22, time: 0.900, data_time: 0.017, memory: 11969, loss_rpn_cls: 0.0628, loss_rpn_bbox: 0.0243, loss_cls: 6.2664, loss_bbox: 3.0414, loss: 9.3949
2023-01-20 11:46:46,185 - mmdet - INFO - Epoch [3][2950/3696] lr: 6.000e-05, eta: 8:27:40, time: 0.912, data_time: 0.018, memory: 11969, loss_rpn_cls: 0.0633, loss_rpn_bbox: 0.0248, loss_cls: 6.2878, loss_bbox: 3.0150, loss: 9.3909
2023-01-20 11:47:31,713 - mmdet - INFO - Epoch [3][3000/3696] lr: 6.000e-05, eta: 8:26:58, time: 0.911, data_time: 0.017, memory: 11969, loss_rpn_cls: 0.0621, loss_rpn_bbox: 0.0245, loss_cls: 6.2815, loss_bbox: 3.0299, loss: 9.3981
2023-01-20 11:48:17,265 - mmdet - INFO - Epoch [3][3050/3696] lr: 6.000e-05, eta: 8:26:16, time: 0.911, data_time: 0.017, memory: 12346, loss_rpn_cls: 0.0621, loss_rpn_bbox: 0.0246, loss_cls: 6.2802, loss_bbox: 3.0240, loss: 9.3909
2023-01-20 11:49:03,192 - mmdet - INFO - Epoch [3][3100/3696] lr: 6.000e-05, eta: 8:25:35, time: 0.919, data_time: 0.017, memory: 12346, loss_rpn_cls: 0.0630, loss_rpn_bbox: 0.0249, loss_cls: 6.2879, loss_bbox: 3.0210, loss: 9.3967
2023-01-20 11:49:49,323 - mmdet - INFO - Epoch [3][3150/3696] lr: 6.000e-05, eta: 8:24:54, time: 0.923, data_time: 0.017, memory: 12346, loss_rpn_cls: 0.0632, loss_rpn_bbox: 0.0247, loss_cls: 6.2873, loss_bbox: 3.0007, loss: 9.3758
2023-01-20 11:50:34,484 - mmdet - INFO - Epoch [3][3200/3696] lr: 6.000e-05, eta: 8:24:10, time: 0.903, data_time: 0.017, memory: 12346, loss_rpn_cls: 0.0632, loss_rpn_bbox: 0.0245, loss_cls: 6.2731, loss_bbox: 3.0242, loss: 9.3851
2023-01-20 11:51:19,851 - mmdet - INFO - Epoch [3][3250/3696] lr: 6.000e-05, eta: 8:23:27, time: 0.907, data_time: 0.017, memory: 12346, loss_rpn_cls: 0.0634, loss_rpn_bbox: 0.0249, loss_cls: 6.2815, loss_bbox: 3.0170, loss: 9.3868
2023-01-20 11:52:05,473 - mmdet - INFO - Epoch [3][3300/3696] lr: 6.000e-05, eta: 8:22:45, time: 0.912, data_time: 0.017, memory: 12346, loss_rpn_cls: 0.0621, loss_rpn_bbox: 0.0243, loss_cls: 6.2683, loss_bbox: 3.0174, loss: 9.3721
2023-01-20 11:52:50,798 - mmdet - INFO - Epoch [3][3350/3696] lr: 6.000e-05, eta: 8:22:02, time: 0.907, data_time: 0.018, memory: 12346, loss_rpn_cls: 0.0625, loss_rpn_bbox: 0.0246, loss_cls: 6.2767, loss_bbox: 3.0198, loss: 9.3835
2023-01-20 11:53:36,008 - mmdet - INFO - Epoch [3][3400/3696] lr: 6.000e-05, eta: 8:21:18, time: 0.904, data_time: 0.017, memory: 12346, loss_rpn_cls: 0.0619, loss_rpn_bbox: 0.0244, loss_cls: 6.2737, loss_bbox: 3.0116, loss: 9.3717
2023-01-20 11:54:21,834 - mmdet - INFO - Epoch [3][3450/3696] lr: 6.000e-05, eta: 8:20:37, time: 0.917, data_time: 0.017, memory: 12346, loss_rpn_cls: 0.0632, loss_rpn_bbox: 0.0247, loss_cls: 6.2772, loss_bbox: 3.0176, loss: 9.3827
2023-01-20 11:55:07,490 - mmdet - INFO - Epoch [3][3500/3696] lr: 6.000e-05, eta: 8:19:55, time: 0.913, data_time: 0.017, memory: 12346, loss_rpn_cls: 0.0626, loss_rpn_bbox: 0.0246, loss_cls: 6.2851, loss_bbox: 3.0165, loss: 9.3887
2023-01-20 11:55:52,443 - mmdet - INFO - Epoch [3][3550/3696] lr: 6.000e-05, eta: 8:19:10, time: 0.899, data_time: 0.018, memory: 12346, loss_rpn_cls: 0.0615, loss_rpn_bbox: 0.0240, loss_cls: 6.2571, loss_bbox: 3.0350, loss: 9.3776
2023-01-20 11:56:38,063 - mmdet - INFO - Epoch [3][3600/3696] lr: 6.000e-05, eta: 8:18:28, time: 0.912, data_time: 0.017, memory: 12346, loss_rpn_cls: 0.0617, loss_rpn_bbox: 0.0243, loss_cls: 6.2703, loss_bbox: 3.0239, loss: 9.3801
2023-01-20 11:57:23,505 - mmdet - INFO - Epoch [3][3650/3696] lr: 6.000e-05, eta: 8:17:45, time: 0.909, data_time: 0.018, memory: 12346, loss_rpn_cls: 0.0626, loss_rpn_bbox: 0.0246, loss_cls: 6.2847, loss_bbox: 3.0173, loss: 9.3893
2023-01-20 11:58:05,391 - mmdet - INFO - Saving checkpoint at 3 epochs
2023-01-20 11:58:56,040 - mmdet - INFO - Epoch [4][50/3696] lr: 6.000e-05, eta: 8:14:23, time: 0.949, data_time: 0.073, memory: 12346, loss_rpn_cls: 0.0619, loss_rpn_bbox: 0.0245, loss_cls: 6.2754, loss_bbox: 3.0139, loss: 9.3757
2023-01-20 11:59:41,544 - mmdet - INFO - Epoch [4][100/3696] lr: 6.000e-05, eta: 8:13:41, time: 0.910, data_time: 0.017, memory: 12346, loss_rpn_cls: 0.0627, loss_rpn_bbox: 0.0246, loss_cls: 6.2799, loss_bbox: 3.0070, loss: 9.3743
2023-01-20 12:00:26,930 - mmdet - INFO - Epoch [4][150/3696] lr: 6.000e-05, eta: 8:12:59, time: 0.908, data_time: 0.018, memory: 12346, loss_rpn_cls: 0.0635, loss_rpn_bbox: 0.0249, loss_cls: 6.2851, loss_bbox: 3.0118, loss: 9.3854
2023-01-20 12:01:12,231 - mmdet - INFO - Epoch [4][200/3696] lr: 6.000e-05, eta: 8:12:16, time: 0.906, data_time: 0.018, memory: 12346, loss_rpn_cls: 0.0613, loss_rpn_bbox: 0.0240, loss_cls: 6.2602, loss_bbox: 3.0223, loss: 9.3678
2023-01-20 12:01:57,006 - mmdet - INFO - Epoch [4][250/3696] lr: 6.000e-05, eta: 8:11:32, time: 0.895, data_time: 0.017, memory: 12346, loss_rpn_cls: 0.0625, loss_rpn_bbox: 0.0243, loss_cls: 6.2670, loss_bbox: 3.0261, loss: 9.3800
2023-01-20 12:02:42,427 - mmdet - INFO - Epoch [4][300/3696] lr: 6.000e-05, eta: 8:10:49, time: 0.909, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0614, loss_rpn_bbox: 0.0242, loss_cls: 6.2700, loss_bbox: 3.0258, loss: 9.3814
2023-01-20 12:03:27,660 - mmdet - INFO - Epoch [4][350/3696] lr: 6.000e-05, eta: 8:10:06, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0623, loss_rpn_bbox: 0.0241, loss_cls: 6.2629, loss_bbox: 3.0195, loss: 9.3688
2023-01-20 12:04:13,578 - mmdet - INFO - Epoch [4][400/3696] lr: 6.000e-05, eta: 8:09:25, time: 0.918, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0624, loss_rpn_bbox: 0.0245, loss_cls: 6.2733, loss_bbox: 2.9985, loss: 9.3587
2023-01-20 12:04:58,550 - mmdet - INFO - Epoch [4][450/3696] lr: 6.000e-05, eta: 8:08:41, time: 0.899, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0614, loss_rpn_bbox: 0.0243, loss_cls: 6.2674, loss_bbox: 3.0174, loss: 9.3704
2023-01-20 12:05:43,955 - mmdet - INFO - Epoch [4][500/3696] lr: 6.000e-05, eta: 8:07:59, time: 0.908, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0621, loss_rpn_bbox: 0.0247, loss_cls: 6.2784, loss_bbox: 3.0071, loss: 9.3724
2023-01-20 12:06:28,603 - mmdet - INFO - Epoch [4][550/3696] lr: 6.000e-05, eta: 8:07:14, time: 0.893, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0612, loss_rpn_bbox: 0.0242, loss_cls: 6.2762, loss_bbox: 3.0212, loss: 9.3828
2023-01-20 12:07:14,624 - mmdet - INFO - Epoch [4][600/3696] lr: 6.000e-05, eta: 8:06:33, time: 0.920, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0618, loss_rpn_bbox: 0.0243, loss_cls: 6.2640, loss_bbox: 3.0069, loss: 9.3569
2023-01-20 12:08:00,073 - mmdet - INFO - Epoch [4][650/3696] lr: 6.000e-05, eta: 8:05:50, time: 0.909, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0614, loss_rpn_bbox: 0.0244, loss_cls: 6.2869, loss_bbox: 3.0081, loss: 9.3809
2023-01-20 12:08:45,076 - mmdet - INFO - Epoch [4][700/3696] lr: 6.000e-05, eta: 8:05:06, time: 0.900, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0629, loss_rpn_bbox: 0.0246, loss_cls: 6.2762, loss_bbox: 3.0083, loss: 9.3720
2023-01-20 12:09:30,261 - mmdet - INFO - Epoch [4][750/3696] lr: 6.000e-05, eta: 8:04:23, time: 0.904, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0617, loss_rpn_bbox: 0.0244, loss_cls: 6.2682, loss_bbox: 3.0029, loss: 9.3572
2023-01-20 12:10:15,814 - mmdet - INFO - Epoch [4][800/3696] lr: 6.000e-05, eta: 8:03:41, time: 0.911, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0618, loss_rpn_bbox: 0.0245, loss_cls: 6.2744, loss_bbox: 3.0043, loss: 9.3650
2023-01-20 12:11:01,808 - mmdet - INFO - Epoch [4][850/3696] lr: 6.000e-05, eta: 8:03:00, time: 0.920, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0622, loss_rpn_bbox: 0.0245, loss_cls: 6.2748, loss_bbox: 2.9942, loss: 9.3557
2023-01-20 12:11:47,119 - mmdet - INFO - Epoch [4][900/3696] lr: 6.000e-05, eta: 8:02:16, time: 0.906, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0620, loss_rpn_bbox: 0.0245, loss_cls: 6.2763, loss_bbox: 3.0025, loss: 9.3653
2023-01-20 12:12:32,378 - mmdet - INFO - Epoch [4][950/3696] lr: 6.000e-05, eta: 8:01:33, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0617, loss_rpn_bbox: 0.0240, loss_cls: 6.2558, loss_bbox: 3.0171, loss: 9.3586
2023-01-20 12:13:17,444 - mmdet - INFO - Epoch [4][1000/3696] lr: 6.000e-05, eta: 8:00:50, time: 0.901, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0614, loss_rpn_bbox: 0.0243, loss_cls: 6.2641, loss_bbox: 3.0033, loss: 9.3532
2023-01-20 12:14:02,863 - mmdet - INFO - Epoch [4][1050/3696] lr: 6.000e-05, eta: 8:00:07, time: 0.908, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0630, loss_rpn_bbox: 0.0248, loss_cls: 6.2854, loss_bbox: 3.0056, loss: 9.3789
2023-01-20 12:14:47,285 - mmdet - INFO - Epoch [4][1100/3696] lr: 6.000e-05, eta: 7:59:21, time: 0.888, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0607, loss_rpn_bbox: 0.0240, loss_cls: 6.2632, loss_bbox: 3.0253, loss: 9.3732
2023-01-20 12:15:35,689 - mmdet - INFO - Epoch [4][1150/3696] lr: 6.000e-05, eta: 7:58:46, time: 0.968, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0621, loss_rpn_bbox: 0.0247, loss_cls: 6.2712, loss_bbox: 2.9906, loss: 9.3485
2023-01-20 12:16:20,922 - mmdet - INFO - Epoch [4][1200/3696] lr: 6.000e-05, eta: 7:58:03, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0620, loss_rpn_bbox: 0.0244, loss_cls: 6.2714, loss_bbox: 3.0090, loss: 9.3668
2023-01-20 12:17:10,658 - mmdet - INFO - Epoch [4][1250/3696] lr: 6.000e-05, eta: 7:57:31, time: 0.995, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0614, loss_rpn_bbox: 0.0242, loss_cls: 6.2689, loss_bbox: 3.0032, loss: 9.3578
2023-01-20 12:17:55,798 - mmdet - INFO - Epoch [4][1300/3696] lr: 6.000e-05, eta: 7:56:47, time: 0.903, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0618, loss_rpn_bbox: 0.0245, loss_cls: 6.2807, loss_bbox: 3.0011, loss: 9.3681
2023-01-20 12:18:43,118 - mmdet - INFO - Epoch [4][1350/3696] lr: 6.000e-05, eta: 7:56:09, time: 0.946, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0614, loss_rpn_bbox: 0.0244, loss_cls: 6.2793, loss_bbox: 3.0063, loss: 9.3715
2023-01-20 12:19:32,696 - mmdet - INFO - Epoch [4][1400/3696] lr: 6.000e-05, eta: 7:55:37, time: 0.992, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0619, loss_rpn_bbox: 0.0241, loss_cls: 6.2633, loss_bbox: 3.0069, loss: 9.3563
2023-01-20 12:20:21,045 - mmdet - INFO - Epoch [4][1450/3696] lr: 6.000e-05, eta: 7:55:01, time: 0.967, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0627, loss_rpn_bbox: 0.0249, loss_cls: 6.2862, loss_bbox: 2.9794, loss: 9.3531
2023-01-20 12:21:06,555 - mmdet - INFO - Epoch [4][1500/3696] lr: 6.000e-05, eta: 7:54:18, time: 0.910, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0626, loss_rpn_bbox: 0.0248, loss_cls: 6.2801, loss_bbox: 3.0001, loss: 9.3676
2023-01-20 12:21:51,991 - mmdet - INFO - Epoch [4][1550/3696] lr: 6.000e-05, eta: 7:53:35, time: 0.909, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0618, loss_rpn_bbox: 0.0243, loss_cls: 6.2687, loss_bbox: 2.9997, loss: 9.3545
2023-01-20 12:22:37,674 - mmdet - INFO - Epoch [4][1600/3696] lr: 6.000e-05, eta: 7:52:52, time: 0.914, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0618, loss_rpn_bbox: 0.0240, loss_cls: 6.2650, loss_bbox: 3.0053, loss: 9.3562
2023-01-20 12:23:22,709 - mmdet - INFO - Epoch [4][1650/3696] lr: 6.000e-05, eta: 7:52:08, time: 0.901, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0618, loss_rpn_bbox: 0.0245, loss_cls: 6.2776, loss_bbox: 2.9936, loss: 9.3575
2023-01-20 12:24:07,376 - mmdet - INFO - Epoch [4][1700/3696] lr: 6.000e-05, eta: 7:51:23, time: 0.893, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0612, loss_rpn_bbox: 0.0240, loss_cls: 6.2664, loss_bbox: 3.0205, loss: 9.3721
2023-01-20 12:24:52,921 - mmdet - INFO - Epoch [4][1750/3696] lr: 6.000e-05, eta: 7:50:40, time: 0.911, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0625, loss_rpn_bbox: 0.0247, loss_cls: 6.2787, loss_bbox: 2.9800, loss: 9.3459
2023-01-20 12:25:38,196 - mmdet - INFO - Epoch [4][1800/3696] lr: 6.000e-05, eta: 7:49:56, time: 0.906, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0610, loss_rpn_bbox: 0.0240, loss_cls: 6.2651, loss_bbox: 3.0071, loss: 9.3572
2023-01-20 12:26:23,583 - mmdet - INFO - Epoch [4][1850/3696] lr: 6.000e-05, eta: 7:49:13, time: 0.908, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0623, loss_rpn_bbox: 0.0246, loss_cls: 6.2781, loss_bbox: 2.9890, loss: 9.3540
2023-01-20 12:27:09,355 - mmdet - INFO - Epoch [4][1900/3696] lr: 6.000e-05, eta: 7:48:30, time: 0.915, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0620, loss_rpn_bbox: 0.0248, loss_cls: 6.2841, loss_bbox: 2.9846, loss: 9.3555
2023-01-20 12:27:54,446 - mmdet - INFO - Epoch [4][1950/3696] lr: 6.000e-05, eta: 7:47:46, time: 0.902, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0607, loss_rpn_bbox: 0.0240, loss_cls: 6.2671, loss_bbox: 3.0022, loss: 9.3540
2023-01-20 12:28:39,799 - mmdet - INFO - Epoch [4][2000/3696] lr: 6.000e-05, eta: 7:47:03, time: 0.907, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0624, loss_rpn_bbox: 0.0247, loss_cls: 6.2751, loss_bbox: 2.9907, loss: 9.3529
2023-01-20 12:29:24,987 - mmdet - INFO - Epoch [4][2050/3696] lr: 6.000e-05, eta: 7:46:19, time: 0.904, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0618, loss_rpn_bbox: 0.0245, loss_cls: 6.2749, loss_bbox: 3.0028, loss: 9.3640
2023-01-20 12:30:10,673 - mmdet - INFO - Epoch [4][2100/3696] lr: 6.000e-05, eta: 7:45:36, time: 0.914, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0619, loss_rpn_bbox: 0.0247, loss_cls: 6.2726, loss_bbox: 2.9779, loss: 9.3372
2023-01-20 12:30:56,109 - mmdet - INFO - Epoch [4][2150/3696] lr: 6.000e-05, eta: 7:44:53, time: 0.909, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0609, loss_rpn_bbox: 0.0242, loss_cls: 6.2638, loss_bbox: 2.9903, loss: 9.3392
2023-01-20 12:31:41,424 - mmdet - INFO - Epoch [4][2200/3696] lr: 6.000e-05, eta: 7:44:09, time: 0.906, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0616, loss_rpn_bbox: 0.0242, loss_cls: 6.2670, loss_bbox: 2.9843, loss: 9.3371
2023-01-20 12:32:26,001 - mmdet - INFO - Epoch [4][2250/3696] lr: 6.000e-05, eta: 7:43:23, time: 0.892, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0610, loss_rpn_bbox: 0.0239, loss_cls: 6.2513, loss_bbox: 3.0066, loss: 9.3427
2023-01-20 12:33:11,476 - mmdet - INFO - Epoch [4][2300/3696] lr: 6.000e-05, eta: 7:42:40, time: 0.909, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0609, loss_rpn_bbox: 0.0243, loss_cls: 6.2723, loss_bbox: 2.9918, loss: 9.3493
2023-01-20 12:33:56,959 - mmdet - INFO - Epoch [4][2350/3696] lr: 6.000e-05, eta: 7:41:57, time: 0.910, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0627, loss_rpn_bbox: 0.0250, loss_cls: 6.2758, loss_bbox: 2.9906, loss: 9.3541
2023-01-20 12:34:42,751 - mmdet - INFO - Epoch [4][2400/3696] lr: 6.000e-05, eta: 7:41:14, time: 0.916, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0620, loss_rpn_bbox: 0.0248, loss_cls: 6.2732, loss_bbox: 2.9845, loss: 9.3444
2023-01-20 12:35:27,942 - mmdet - INFO - Epoch [4][2450/3696] lr: 6.000e-05, eta: 7:40:30, time: 0.904, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0610, loss_rpn_bbox: 0.0241, loss_cls: 6.2598, loss_bbox: 2.9965, loss: 9.3414
2023-01-20 12:36:13,287 - mmdet - INFO - Epoch [4][2500/3696] lr: 6.000e-05, eta: 7:39:46, time: 0.907, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0615, loss_rpn_bbox: 0.0247, loss_cls: 6.2807, loss_bbox: 2.9897, loss: 9.3565
2023-01-20 12:36:59,167 - mmdet - INFO - Epoch [4][2550/3696] lr: 6.000e-05, eta: 7:39:04, time: 0.918, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0610, loss_rpn_bbox: 0.0243, loss_cls: 6.2755, loss_bbox: 2.9904, loss: 9.3511
2023-01-20 12:37:44,410 - mmdet - INFO - Epoch [4][2600/3696] lr: 6.000e-05, eta: 7:38:20, time: 0.905, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0608, loss_rpn_bbox: 0.0243, loss_cls: 6.2725, loss_bbox: 2.9926, loss: 9.3503
2023-01-20 12:38:29,940 - mmdet - INFO - Epoch [4][2650/3696] lr: 6.000e-05, eta: 7:37:37, time: 0.911, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0613, loss_rpn_bbox: 0.0246, loss_cls: 6.2777, loss_bbox: 2.9918, loss: 9.3554
2023-01-20 12:39:15,826 - mmdet - INFO - Epoch [4][2700/3696] lr: 6.000e-05, eta: 7:36:54, time: 0.918, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0608, loss_rpn_bbox: 0.0245, loss_cls: 6.2830, loss_bbox: 2.9831, loss: 9.3514
2023-01-20 12:40:01,583 - mmdet - INFO - Epoch [4][2750/3696] lr: 6.000e-05, eta: 7:36:11, time: 0.915, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0611, loss_rpn_bbox: 0.0245, loss_cls: 6.2774, loss_bbox: 2.9961, loss: 9.3591
2023-01-20 12:40:47,392 - mmdet - INFO - Epoch [4][2800/3696] lr: 6.000e-05, eta: 7:35:29, time: 0.916, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0618, loss_rpn_bbox: 0.0248, loss_cls: 6.2760, loss_bbox: 2.9610, loss: 9.3235
2023-01-20 12:41:32,508 - mmdet - INFO - Epoch [4][2850/3696] lr: 6.000e-05, eta: 7:34:44, time: 0.902, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0242, loss_cls: 6.2753, loss_bbox: 2.9932, loss: 9.3530
2023-01-20 12:42:17,875 - mmdet - INFO - Epoch [4][2900/3696] lr: 6.000e-05, eta: 7:34:01, time: 0.907, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0611, loss_rpn_bbox: 0.0246, loss_cls: 6.2690, loss_bbox: 2.9848, loss: 9.3395
2023-01-20 12:43:02,832 - mmdet - INFO - Epoch [4][2950/3696] lr: 6.000e-05, eta: 7:33:16, time: 0.899, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0611, loss_rpn_bbox: 0.0241, loss_cls: 6.2554, loss_bbox: 2.9918, loss: 9.3324
2023-01-20 12:43:48,172 - mmdet - INFO - Epoch [4][3000/3696] lr: 6.000e-05, eta: 7:32:32, time: 0.907, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0611, loss_rpn_bbox: 0.0241, loss_cls: 6.2570, loss_bbox: 2.9895, loss: 9.3318
2023-01-20 12:44:33,636 - mmdet - INFO - Epoch [4][3050/3696] lr: 6.000e-05, eta: 7:31:49, time: 0.909, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0607, loss_rpn_bbox: 0.0244, loss_cls: 6.2736, loss_bbox: 2.9889, loss: 9.3476
2023-01-20 12:45:19,349 - mmdet - INFO - Epoch [4][3100/3696] lr: 6.000e-05, eta: 7:31:05, time: 0.914, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0615, loss_rpn_bbox: 0.0246, loss_cls: 6.2684, loss_bbox: 2.9847, loss: 9.3392
2023-01-20 12:46:04,289 - mmdet - INFO - Epoch [4][3150/3696] lr: 6.000e-05, eta: 7:30:21, time: 0.899, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0616, loss_rpn_bbox: 0.0244, loss_cls: 6.2720, loss_bbox: 2.9909, loss: 9.3488
2023-01-20 12:46:50,051 - mmdet - INFO - Epoch [4][3200/3696] lr: 6.000e-05, eta: 7:29:38, time: 0.915, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0598, loss_rpn_bbox: 0.0241, loss_cls: 6.2641, loss_bbox: 2.9844, loss: 9.3324
2023-01-20 12:47:35,805 - mmdet - INFO - Epoch [4][3250/3696] lr: 6.000e-05, eta: 7:28:55, time: 0.915, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0614, loss_rpn_bbox: 0.0248, loss_cls: 6.2832, loss_bbox: 2.9814, loss: 9.3507
2023-01-20 12:48:21,359 - mmdet - INFO - Epoch [4][3300/3696] lr: 6.000e-05, eta: 7:28:11, time: 0.911, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0605, loss_rpn_bbox: 0.0243, loss_cls: 6.2714, loss_bbox: 2.9909, loss: 9.3472
2023-01-20 12:49:06,863 - mmdet - INFO - Epoch [4][3350/3696] lr: 6.000e-05, eta: 7:27:28, time: 0.910, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0613, loss_rpn_bbox: 0.0245, loss_cls: 6.2644, loss_bbox: 2.9805, loss: 9.3307
2023-01-20 12:49:51,856 - mmdet - INFO - Epoch [4][3400/3696] lr: 6.000e-05, eta: 7:26:43, time: 0.900, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0609, loss_rpn_bbox: 0.0241, loss_cls: 6.2717, loss_bbox: 2.9885, loss: 9.3452
2023-01-20 12:50:36,551 - mmdet - INFO - Epoch [4][3450/3696] lr: 6.000e-05, eta: 7:25:58, time: 0.894, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0239, loss_cls: 6.2628, loss_bbox: 2.9913, loss: 9.3384
2023-01-20 12:51:22,231 - mmdet - INFO - Epoch [4][3500/3696] lr: 6.000e-05, eta: 7:25:15, time: 0.914, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0608, loss_rpn_bbox: 0.0242, loss_cls: 6.2650, loss_bbox: 2.9876, loss: 9.3376
2023-01-20 12:52:08,164 - mmdet - INFO - Epoch [4][3550/3696] lr: 6.000e-05, eta: 7:24:32, time: 0.919, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0241, loss_cls: 6.2618, loss_bbox: 2.9887, loss: 9.3338
2023-01-20 12:52:53,794 - mmdet - INFO - Epoch [4][3600/3696] lr: 6.000e-05, eta: 7:23:49, time: 0.913, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0607, loss_rpn_bbox: 0.0244, loss_cls: 6.2676, loss_bbox: 2.9728, loss: 9.3255
2023-01-20 12:53:39,005 - mmdet - INFO - Epoch [4][3650/3696] lr: 6.000e-05, eta: 7:23:04, time: 0.904, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0238, loss_cls: 6.2614, loss_bbox: 2.9808, loss: 9.3265
2023-01-20 12:54:21,521 - mmdet - INFO - Saving checkpoint at 4 epochs
2023-01-20 12:55:13,005 - mmdet - INFO - Epoch [5][50/3696] lr: 6.000e-05, eta: 7:20:23, time: 0.964, data_time: 0.073, memory: 13163, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0240, loss_cls: 6.2646, loss_bbox: 2.9731, loss: 9.3222
2023-01-20 12:55:58,318 - mmdet - INFO - Epoch [5][100/3696] lr: 6.000e-05, eta: 7:19:39, time: 0.906, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0603, loss_rpn_bbox: 0.0236, loss_cls: 6.2508, loss_bbox: 2.9869, loss: 9.3216
2023-01-20 12:56:43,574 - mmdet - INFO - Epoch [5][150/3696] lr: 6.000e-05, eta: 7:18:55, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0598, loss_rpn_bbox: 0.0240, loss_cls: 6.2749, loss_bbox: 2.9818, loss: 9.3406
2023-01-20 12:57:28,584 - mmdet - INFO - Epoch [5][200/3696] lr: 6.000e-05, eta: 7:18:11, time: 0.900, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0608, loss_rpn_bbox: 0.0242, loss_cls: 6.2522, loss_bbox: 2.9765, loss: 9.3136
2023-01-20 12:58:13,693 - mmdet - INFO - Epoch [5][250/3696] lr: 6.000e-05, eta: 7:17:27, time: 0.902, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0603, loss_rpn_bbox: 0.0241, loss_cls: 6.2589, loss_bbox: 2.9930, loss: 9.3363
2023-01-20 12:58:58,970 - mmdet - INFO - Epoch [5][300/3696] lr: 6.000e-05, eta: 7:16:43, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0611, loss_rpn_bbox: 0.0245, loss_cls: 6.2700, loss_bbox: 2.9881, loss: 9.3436
2023-01-20 12:59:44,562 - mmdet - INFO - Epoch [5][350/3696] lr: 6.000e-05, eta: 7:16:00, time: 0.912, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0599, loss_rpn_bbox: 0.0238, loss_cls: 6.2644, loss_bbox: 2.9931, loss: 9.3413
2023-01-20 13:00:30,101 - mmdet - INFO - Epoch [5][400/3696] lr: 6.000e-05, eta: 7:15:17, time: 0.911, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0599, loss_rpn_bbox: 0.0243, loss_cls: 6.2619, loss_bbox: 2.9895, loss: 9.3356
2023-01-20 13:01:15,651 - mmdet - INFO - Epoch [5][450/3696] lr: 6.000e-05, eta: 7:14:33, time: 0.911, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0611, loss_rpn_bbox: 0.0243, loss_cls: 6.2711, loss_bbox: 2.9782, loss: 9.3347
2023-01-20 13:02:01,637 - mmdet - INFO - Epoch [5][500/3696] lr: 6.000e-05, eta: 7:13:51, time: 0.920, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0612, loss_rpn_bbox: 0.0249, loss_cls: 6.2829, loss_bbox: 2.9562, loss: 9.3251
2023-01-20 13:02:47,130 - mmdet - INFO - Epoch [5][550/3696] lr: 6.000e-05, eta: 7:13:07, time: 0.910, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0615, loss_rpn_bbox: 0.0247, loss_cls: 6.2728, loss_bbox: 2.9588, loss: 9.3178
2023-01-20 13:03:32,443 - mmdet - INFO - Epoch [5][600/3696] lr: 6.000e-05, eta: 7:12:24, time: 0.906, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0600, loss_rpn_bbox: 0.0242, loss_cls: 6.2665, loss_bbox: 2.9750, loss: 9.3258
2023-01-20 13:04:17,704 - mmdet - INFO - Epoch [5][650/3696] lr: 6.000e-05, eta: 7:11:40, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0617, loss_rpn_bbox: 0.0247, loss_cls: 6.2758, loss_bbox: 2.9611, loss: 9.3231
2023-01-20 13:05:03,440 - mmdet - INFO - Epoch [5][700/3696] lr: 6.000e-05, eta: 7:10:57, time: 0.915, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0607, loss_rpn_bbox: 0.0244, loss_cls: 6.2712, loss_bbox: 2.9804, loss: 9.3368
2023-01-20 13:05:48,712 - mmdet - INFO - Epoch [5][750/3696] lr: 6.000e-05, eta: 7:10:13, time: 0.905, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0611, loss_rpn_bbox: 0.0248, loss_cls: 6.2835, loss_bbox: 2.9676, loss: 9.3369
2023-01-20 13:06:33,824 - mmdet - INFO - Epoch [5][800/3696] lr: 6.000e-05, eta: 7:09:29, time: 0.902, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0613, loss_rpn_bbox: 0.0243, loss_cls: 6.2636, loss_bbox: 2.9845, loss: 9.3337
2023-01-20 13:07:18,641 - mmdet - INFO - Epoch [5][850/3696] lr: 6.000e-05, eta: 7:08:44, time: 0.896, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0612, loss_rpn_bbox: 0.0243, loss_cls: 6.2653, loss_bbox: 2.9836, loss: 9.3345
2023-01-20 13:08:04,203 - mmdet - INFO - Epoch [5][900/3696] lr: 6.000e-05, eta: 7:08:01, time: 0.911, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0605, loss_rpn_bbox: 0.0245, loss_cls: 6.2744, loss_bbox: 2.9727, loss: 9.3322
2023-01-20 13:08:49,415 - mmdet - INFO - Epoch [5][950/3696] lr: 6.000e-05, eta: 7:07:17, time: 0.904, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0614, loss_rpn_bbox: 0.0243, loss_cls: 6.2657, loss_bbox: 2.9755, loss: 9.3269
2023-01-20 13:09:35,190 - mmdet - INFO - Epoch [5][1000/3696] lr: 6.000e-05, eta: 7:06:34, time: 0.915, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0611, loss_rpn_bbox: 0.0245, loss_cls: 6.2675, loss_bbox: 2.9679, loss: 9.3211
2023-01-20 13:10:20,753 - mmdet - INFO - Epoch [5][1050/3696] lr: 6.000e-05, eta: 7:05:50, time: 0.911, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0614, loss_rpn_bbox: 0.0245, loss_cls: 6.2667, loss_bbox: 2.9630, loss: 9.3156
2023-01-20 13:11:06,010 - mmdet - INFO - Epoch [5][1100/3696] lr: 6.000e-05, eta: 7:05:06, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0611, loss_rpn_bbox: 0.0245, loss_cls: 6.2746, loss_bbox: 2.9695, loss: 9.3296
2023-01-20 13:11:50,980 - mmdet - INFO - Epoch [5][1150/3696] lr: 6.000e-05, eta: 7:04:22, time: 0.899, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0601, loss_rpn_bbox: 0.0239, loss_cls: 6.2589, loss_bbox: 2.9677, loss: 9.3106
2023-01-20 13:12:36,071 - mmdet - INFO - Epoch [5][1200/3696] lr: 6.000e-05, eta: 7:03:37, time: 0.902, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0609, loss_rpn_bbox: 0.0245, loss_cls: 6.2760, loss_bbox: 2.9741, loss: 9.3355
2023-01-20 13:13:20,787 - mmdet - INFO -
Epoch [5][1250/3696] lr: 6.000e-05, eta: 7:02:52, time: 0.894, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0605, loss_rpn_bbox: 0.0242, loss_cls: 6.2591, loss_bbox: 2.9783, loss: 9.3221 2023-01-20 13:14:05,475 - mmdet - INFO - Epoch [5][1300/3696] lr: 6.000e-05, eta: 7:02:07, time: 0.894, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0238, loss_cls: 6.2570, loss_bbox: 2.9825, loss: 9.3229 2023-01-20 13:14:50,883 - mmdet - INFO - Epoch [5][1350/3696] lr: 6.000e-05, eta: 7:01:24, time: 0.908, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0611, loss_rpn_bbox: 0.0241, loss_cls: 6.2616, loss_bbox: 2.9730, loss: 9.3199 2023-01-20 13:15:35,688 - mmdet - INFO - Epoch [5][1400/3696] lr: 6.000e-05, eta: 7:00:39, time: 0.896, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0593, loss_rpn_bbox: 0.0239, loss_cls: 6.2598, loss_bbox: 2.9846, loss: 9.3276 2023-01-20 13:16:21,350 - mmdet - INFO - Epoch [5][1450/3696] lr: 6.000e-05, eta: 6:59:56, time: 0.913, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0610, loss_rpn_bbox: 0.0244, loss_cls: 6.2782, loss_bbox: 2.9654, loss: 9.3290 2023-01-20 13:17:06,020 - mmdet - INFO - Epoch [5][1500/3696] lr: 6.000e-05, eta: 6:59:11, time: 0.893, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0605, loss_rpn_bbox: 0.0240, loss_cls: 6.2613, loss_bbox: 2.9882, loss: 9.3340 2023-01-20 13:17:51,577 - mmdet - INFO - Epoch [5][1550/3696] lr: 6.000e-05, eta: 6:58:27, time: 0.911, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0610, loss_rpn_bbox: 0.0248, loss_cls: 6.2802, loss_bbox: 2.9622, loss: 9.3282 2023-01-20 13:18:36,632 - mmdet - INFO - Epoch [5][1600/3696] lr: 6.000e-05, eta: 6:57:43, time: 0.901, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0614, loss_rpn_bbox: 0.0245, loss_cls: 6.2711, loss_bbox: 2.9641, loss: 9.3211 2023-01-20 13:19:22,323 - mmdet - INFO - Epoch [5][1650/3696] lr: 6.000e-05, eta: 6:56:59, time: 0.914, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0612, loss_rpn_bbox: 0.0245, 
loss_cls: 6.2791, loss_bbox: 2.9700, loss: 9.3348 2023-01-20 13:20:07,637 - mmdet - INFO - Epoch [5][1700/3696] lr: 6.000e-05, eta: 6:56:15, time: 0.906, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0606, loss_rpn_bbox: 0.0245, loss_cls: 6.2776, loss_bbox: 2.9703, loss: 9.3329 2023-01-20 13:20:52,820 - mmdet - INFO - Epoch [5][1750/3696] lr: 6.000e-05, eta: 6:55:31, time: 0.904, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0599, loss_rpn_bbox: 0.0238, loss_cls: 6.2594, loss_bbox: 2.9866, loss: 9.3297 2023-01-20 13:21:37,974 - mmdet - INFO - Epoch [5][1800/3696] lr: 6.000e-05, eta: 6:54:47, time: 0.903, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0603, loss_rpn_bbox: 0.0242, loss_cls: 6.2653, loss_bbox: 2.9572, loss: 9.3070 2023-01-20 13:22:23,443 - mmdet - INFO - Epoch [5][1850/3696] lr: 6.000e-05, eta: 6:54:03, time: 0.909, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0243, loss_cls: 6.2660, loss_bbox: 2.9710, loss: 9.3217 2023-01-20 13:23:08,942 - mmdet - INFO - Epoch [5][1900/3696] lr: 6.000e-05, eta: 6:53:20, time: 0.910, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0611, loss_rpn_bbox: 0.0247, loss_cls: 6.2839, loss_bbox: 2.9759, loss: 9.3456 2023-01-20 13:23:54,322 - mmdet - INFO - Epoch [5][1950/3696] lr: 6.000e-05, eta: 6:52:36, time: 0.908, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0599, loss_rpn_bbox: 0.0242, loss_cls: 6.2684, loss_bbox: 2.9710, loss: 9.3235 2023-01-20 13:24:40,012 - mmdet - INFO - Epoch [5][2000/3696] lr: 6.000e-05, eta: 6:51:52, time: 0.914, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0610, loss_rpn_bbox: 0.0246, loss_cls: 6.2784, loss_bbox: 2.9698, loss: 9.3338 2023-01-20 13:25:25,274 - mmdet - INFO - Epoch [5][2050/3696] lr: 6.000e-05, eta: 6:51:08, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0618, loss_rpn_bbox: 0.0249, loss_cls: 6.2793, loss_bbox: 2.9564, loss: 9.3224 2023-01-20 13:26:10,261 - mmdet - INFO - Epoch [5][2100/3696] lr: 6.000e-05, eta: 6:50:24, 
time: 0.900, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0597, loss_rpn_bbox: 0.0240, loss_cls: 6.2584, loss_bbox: 2.9735, loss: 9.3157 2023-01-20 13:26:55,728 - mmdet - INFO - Epoch [5][2150/3696] lr: 6.000e-05, eta: 6:49:40, time: 0.909, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0603, loss_rpn_bbox: 0.0244, loss_cls: 6.2599, loss_bbox: 2.9644, loss: 9.3089 2023-01-20 13:27:41,815 - mmdet - INFO - Epoch [5][2200/3696] lr: 6.000e-05, eta: 6:48:57, time: 0.921, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0605, loss_rpn_bbox: 0.0243, loss_cls: 6.2606, loss_bbox: 2.9590, loss: 9.3044 2023-01-20 13:28:26,661 - mmdet - INFO - Epoch [5][2250/3696] lr: 6.000e-05, eta: 6:48:12, time: 0.897, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0239, loss_cls: 6.2584, loss_bbox: 2.9766, loss: 9.3185 2023-01-20 13:29:12,625 - mmdet - INFO - Epoch [5][2300/3696] lr: 6.000e-05, eta: 6:47:29, time: 0.919, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0612, loss_rpn_bbox: 0.0246, loss_cls: 6.2751, loss_bbox: 2.9446, loss: 9.3054 2023-01-20 13:29:57,698 - mmdet - INFO - Epoch [5][2350/3696] lr: 6.000e-05, eta: 6:46:45, time: 0.901, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0601, loss_rpn_bbox: 0.0245, loss_cls: 6.2672, loss_bbox: 2.9645, loss: 9.3162 2023-01-20 13:30:42,592 - mmdet - INFO - Epoch [5][2400/3696] lr: 6.000e-05, eta: 6:46:00, time: 0.898, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0597, loss_rpn_bbox: 0.0239, loss_cls: 6.2520, loss_bbox: 2.9804, loss: 9.3160 2023-01-20 13:31:27,989 - mmdet - INFO - Epoch [5][2450/3696] lr: 6.000e-05, eta: 6:45:16, time: 0.908, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0612, loss_rpn_bbox: 0.0246, loss_cls: 6.2787, loss_bbox: 2.9530, loss: 9.3175 2023-01-20 13:32:13,802 - mmdet - INFO - Epoch [5][2500/3696] lr: 6.000e-05, eta: 6:44:33, time: 0.916, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0598, loss_rpn_bbox: 0.0239, loss_cls: 6.2610, loss_bbox: 2.9646, loss: 9.3092 
2023-01-20 13:32:59,198 - mmdet - INFO - Epoch [5][2550/3696] lr: 6.000e-05, eta: 6:43:49, time: 0.908, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0603, loss_rpn_bbox: 0.0243, loss_cls: 6.2651, loss_bbox: 2.9505, loss: 9.3003
2023-01-20 13:33:44,207 - mmdet - INFO - Epoch [5][2600/3696] lr: 6.000e-05, eta: 6:43:04, time: 0.900, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0594, loss_rpn_bbox: 0.0240, loss_cls: 6.2532, loss_bbox: 2.9717, loss: 9.3084
2023-01-20 13:34:29,568 - mmdet - INFO - Epoch [5][2650/3696] lr: 6.000e-05, eta: 6:42:20, time: 0.907, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0241, loss_cls: 6.2613, loss_bbox: 2.9655, loss: 9.3094
2023-01-20 13:35:14,447 - mmdet - INFO - Epoch [5][2700/3696] lr: 6.000e-05, eta: 6:41:35, time: 0.898, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0598, loss_rpn_bbox: 0.0240, loss_cls: 6.2574, loss_bbox: 2.9696, loss: 9.3108
2023-01-20 13:35:59,686 - mmdet - INFO - Epoch [5][2750/3696] lr: 6.000e-05, eta: 6:40:51, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0240, loss_cls: 6.2540, loss_bbox: 2.9749, loss: 9.3134
2023-01-20 13:36:45,380 - mmdet - INFO - Epoch [5][2800/3696] lr: 6.000e-05, eta: 6:40:08, time: 0.914, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0605, loss_rpn_bbox: 0.0246, loss_cls: 6.2742, loss_bbox: 2.9586, loss: 9.3179
2023-01-20 13:37:30,368 - mmdet - INFO - Epoch [5][2850/3696] lr: 6.000e-05, eta: 6:39:23, time: 0.900, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0601, loss_rpn_bbox: 0.0242, loss_cls: 6.2611, loss_bbox: 2.9668, loss: 9.3123
2023-01-20 13:38:15,629 - mmdet - INFO - Epoch [5][2900/3696] lr: 6.000e-05, eta: 6:38:39, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0602, loss_rpn_bbox: 0.0243, loss_cls: 6.2632, loss_bbox: 2.9756, loss: 9.3234
2023-01-20 13:39:01,310 - mmdet - INFO - Epoch [5][2950/3696] lr: 6.000e-05, eta: 6:37:55, time: 0.914, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0602, loss_rpn_bbox: 0.0244, loss_cls: 6.2642, loss_bbox: 2.9505, loss: 9.2992
2023-01-20 13:39:46,614 - mmdet - INFO - Epoch [5][3000/3696] lr: 6.000e-05, eta: 6:37:11, time: 0.906, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0242, loss_cls: 6.2671, loss_bbox: 2.9802, loss: 9.3305
2023-01-20 13:40:31,545 - mmdet - INFO - Epoch [5][3050/3696] lr: 6.000e-05, eta: 6:36:26, time: 0.899, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0239, loss_cls: 6.2581, loss_bbox: 2.9743, loss: 9.3154
2023-01-20 13:41:17,088 - mmdet - INFO - Epoch [5][3100/3696] lr: 6.000e-05, eta: 6:35:42, time: 0.911, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0242, loss_cls: 6.2573, loss_bbox: 2.9519, loss: 9.2937
2023-01-20 13:42:03,208 - mmdet - INFO - Epoch [5][3150/3696] lr: 6.000e-05, eta: 6:34:59, time: 0.922, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0605, loss_rpn_bbox: 0.0244, loss_cls: 6.2669, loss_bbox: 2.9448, loss: 9.2965
2023-01-20 13:42:48,760 - mmdet - INFO - Epoch [5][3200/3696] lr: 6.000e-05, eta: 6:34:16, time: 0.911, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0595, loss_rpn_bbox: 0.0242, loss_cls: 6.2660, loss_bbox: 2.9684, loss: 9.3181
2023-01-20 13:43:34,586 - mmdet - INFO - Epoch [5][3250/3696] lr: 6.000e-05, eta: 6:33:32, time: 0.917, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0240, loss_cls: 6.2620, loss_bbox: 2.9531, loss: 9.2983
2023-01-20 13:44:20,022 - mmdet - INFO - Epoch [5][3300/3696] lr: 6.000e-05, eta: 6:32:48, time: 0.909, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0595, loss_rpn_bbox: 0.0242, loss_cls: 6.2602, loss_bbox: 2.9572, loss: 9.3010
2023-01-20 13:45:05,075 - mmdet - INFO - Epoch [5][3350/3696] lr: 6.000e-05, eta: 6:32:04, time: 0.901, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0239, loss_cls: 6.2617, loss_bbox: 2.9651, loss: 9.3095
2023-01-20 13:45:50,257 - mmdet - INFO - Epoch [5][3400/3696] lr: 6.000e-05, eta: 6:31:19, time: 0.904, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0601, loss_rpn_bbox: 0.0242, loss_cls: 6.2620, loss_bbox: 2.9661, loss: 9.3124
2023-01-20 13:46:35,042 - mmdet - INFO - Epoch [5][3450/3696] lr: 6.000e-05, eta: 6:30:34, time: 0.896, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0236, loss_cls: 6.2546, loss_bbox: 2.9685, loss: 9.3058
2023-01-20 13:47:20,651 - mmdet - INFO - Epoch [5][3500/3696] lr: 6.000e-05, eta: 6:29:50, time: 0.912, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0597, loss_rpn_bbox: 0.0241, loss_cls: 6.2567, loss_bbox: 2.9626, loss: 9.3032
2023-01-20 13:48:06,205 - mmdet - INFO - Epoch [5][3550/3696] lr: 6.000e-05, eta: 6:29:07, time: 0.911, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0608, loss_rpn_bbox: 0.0244, loss_cls: 6.2653, loss_bbox: 2.9493, loss: 9.2998
2023-01-20 13:48:51,526 - mmdet - INFO - Epoch [5][3600/3696] lr: 6.000e-05, eta: 6:28:22, time: 0.906, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0242, loss_cls: 6.2620, loss_bbox: 2.9646, loss: 9.3103
2023-01-20 13:49:36,797 - mmdet - INFO - Epoch [5][3650/3696] lr: 6.000e-05, eta: 6:27:38, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0605, loss_rpn_bbox: 0.0242, loss_cls: 6.2600, loss_bbox: 2.9492, loss: 9.2939
2023-01-20 13:50:18,556 - mmdet - INFO - Saving checkpoint at 5 epochs
2023-01-20 13:51:09,610 - mmdet - INFO - Epoch [6][50/3696] lr: 6.000e-05, eta: 6:25:19, time: 0.958, data_time: 0.073, memory: 13163, loss_rpn_cls: 0.0595, loss_rpn_bbox: 0.0240, loss_cls: 6.2617, loss_bbox: 2.9648, loss: 9.3101
2023-01-20 13:51:55,007 - mmdet - INFO - Epoch [6][100/3696] lr: 6.000e-05, eta: 6:24:35, time: 0.908, data_time: 0.019, memory: 13163, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0240, loss_cls: 6.2558, loss_bbox: 2.9569, loss: 9.2963
2023-01-20 13:52:39,964 - mmdet - INFO - Epoch [6][150/3696] lr: 6.000e-05, eta: 6:23:50, time: 0.899, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0598, loss_rpn_bbox: 0.0239, loss_cls: 6.2585, loss_bbox: 2.9614, loss: 9.3036
2023-01-20 13:53:25,523 - mmdet - INFO - Epoch [6][200/3696] lr: 6.000e-05, eta: 6:23:06, time: 0.911, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0598, loss_rpn_bbox: 0.0241, loss_cls: 6.2602, loss_bbox: 2.9613, loss: 9.3055
2023-01-20 13:54:10,932 - mmdet - INFO - Epoch [6][250/3696] lr: 6.000e-05, eta: 6:22:23, time: 0.908, data_time: 0.019, memory: 13163, loss_rpn_cls: 0.0599, loss_rpn_bbox: 0.0242, loss_cls: 6.2638, loss_bbox: 2.9494, loss: 9.2973
2023-01-20 13:54:57,561 - mmdet - INFO - Epoch [6][300/3696] lr: 6.000e-05, eta: 6:21:40, time: 0.933, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0606, loss_rpn_bbox: 0.0247, loss_cls: 6.2793, loss_bbox: 2.9343, loss: 9.2990
2023-01-20 13:55:43,453 - mmdet - INFO - Epoch [6][350/3696] lr: 6.000e-05, eta: 6:20:57, time: 0.918, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0600, loss_rpn_bbox: 0.0244, loss_cls: 6.2705, loss_bbox: 2.9527, loss: 9.3076
2023-01-20 13:56:29,189 - mmdet - INFO - Epoch [6][400/3696] lr: 6.000e-05, eta: 6:20:14, time: 0.915, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0607, loss_rpn_bbox: 0.0249, loss_cls: 6.2767, loss_bbox: 2.9306, loss: 9.2930
2023-01-20 13:57:14,296 - mmdet - INFO - Epoch [6][450/3696] lr: 6.000e-05, eta: 6:19:29, time: 0.902, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0601, loss_rpn_bbox: 0.0239, loss_cls: 6.2534, loss_bbox: 2.9564, loss: 9.2939
2023-01-20 13:57:59,563 - mmdet - INFO - Epoch [6][500/3696] lr: 6.000e-05, eta: 6:18:45, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0599, loss_rpn_bbox: 0.0245, loss_cls: 6.2681, loss_bbox: 2.9485, loss: 9.3010
2023-01-20 13:58:44,594 - mmdet - INFO - Epoch [6][550/3696] lr: 6.000e-05, eta: 6:18:01, time: 0.901, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0236, loss_cls: 6.2472, loss_bbox: 2.9657, loss: 9.2958
2023-01-20 13:59:30,433 - mmdet - INFO - Epoch [6][600/3696] lr: 6.000e-05, eta: 6:17:17, time: 0.917, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0601, loss_rpn_bbox: 0.0245, loss_cls: 6.2750, loss_bbox: 2.9396, loss: 9.2992
2023-01-20 14:00:16,603 - mmdet - INFO - Epoch [6][650/3696] lr: 6.000e-05, eta: 6:16:34, time: 0.923, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0599, loss_rpn_bbox: 0.0244, loss_cls: 6.2715, loss_bbox: 2.9329, loss: 9.2888
2023-01-20 14:01:02,000 - mmdet - INFO - Epoch [6][700/3696] lr: 6.000e-05, eta: 6:15:50, time: 0.908, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0593, loss_rpn_bbox: 0.0240, loss_cls: 6.2593, loss_bbox: 2.9571, loss: 9.2997
2023-01-20 14:01:46,981 - mmdet - INFO - Epoch [6][750/3696] lr: 6.000e-05, eta: 6:15:06, time: 0.900, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0597, loss_rpn_bbox: 0.0241, loss_cls: 6.2558, loss_bbox: 2.9602, loss: 9.2998
2023-01-20 14:02:32,529 - mmdet - INFO - Epoch [6][800/3696] lr: 6.000e-05, eta: 6:14:22, time: 0.911, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0598, loss_rpn_bbox: 0.0242, loss_cls: 6.2582, loss_bbox: 2.9573, loss: 9.2994
2023-01-20 14:03:18,911 - mmdet - INFO - Epoch [6][850/3696] lr: 6.000e-05, eta: 6:13:39, time: 0.928, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0616, loss_rpn_bbox: 0.0249, loss_cls: 6.2822, loss_bbox: 2.9139, loss: 9.2825
2023-01-20 14:04:03,837 - mmdet - INFO - Epoch [6][900/3696] lr: 6.000e-05, eta: 6:12:55, time: 0.898, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0236, loss_cls: 6.2428, loss_bbox: 2.9688, loss: 9.2933
2023-01-20 14:04:49,223 - mmdet - INFO - Epoch [6][950/3696] lr: 6.000e-05, eta: 6:12:10, time: 0.908, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0595, loss_rpn_bbox: 0.0242, loss_cls: 6.2668, loss_bbox: 2.9600, loss: 9.3105
2023-01-20 14:05:34,944 - mmdet - INFO - Epoch [6][1000/3696] lr: 6.000e-05, eta: 6:11:27, time: 0.915, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0598, loss_rpn_bbox: 0.0244, loss_cls: 6.2698, loss_bbox: 2.9369, loss: 9.2910
2023-01-20 14:06:20,722 - mmdet - INFO - Epoch [6][1050/3696] lr: 6.000e-05, eta: 6:10:43, time: 0.916, data_time: 0.019, memory: 13163, loss_rpn_cls: 0.0606, loss_rpn_bbox: 0.0245, loss_cls: 6.2713, loss_bbox: 2.9474, loss: 9.3038
2023-01-20 14:07:06,232 - mmdet - INFO - Epoch [6][1100/3696] lr: 6.000e-05, eta: 6:09:59, time: 0.910, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0241, loss_cls: 6.2656, loss_bbox: 2.9601, loss: 9.3088
2023-01-20 14:07:51,385 - mmdet - INFO - Epoch [6][1150/3696] lr: 6.000e-05, eta: 6:09:15, time: 0.903, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0243, loss_cls: 6.2611, loss_bbox: 2.9487, loss: 9.2937
2023-01-20 14:08:36,905 - mmdet - INFO - Epoch [6][1200/3696] lr: 6.000e-05, eta: 6:08:31, time: 0.910, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0238, loss_cls: 6.2491, loss_bbox: 2.9605, loss: 9.2918
2023-01-20 14:09:22,653 - mmdet - INFO - Epoch [6][1250/3696] lr: 6.000e-05, eta: 6:07:47, time: 0.915, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0595, loss_rpn_bbox: 0.0239, loss_cls: 6.2515, loss_bbox: 2.9612, loss: 9.2962
2023-01-20 14:10:08,188 - mmdet - INFO - Epoch [6][1300/3696] lr: 6.000e-05, eta: 6:07:04, time: 0.911, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0584, loss_rpn_bbox: 0.0238, loss_cls: 6.2572, loss_bbox: 2.9592, loss: 9.2987
2023-01-20 14:10:53,861 - mmdet - INFO - Epoch [6][1350/3696] lr: 6.000e-05, eta: 6:06:20, time: 0.913, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0597, loss_rpn_bbox: 0.0240, loss_cls: 6.2538, loss_bbox: 2.9611, loss: 9.2986
2023-01-20 14:11:39,353 - mmdet - INFO - Epoch [6][1400/3696] lr: 6.000e-05, eta: 6:05:36, time: 0.910, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0597, loss_rpn_bbox: 0.0243, loss_cls: 6.2624, loss_bbox: 2.9583, loss: 9.3047
2023-01-20 14:12:24,942 - mmdet - INFO - Epoch [6][1450/3696] lr: 6.000e-05, eta: 6:04:52, time: 0.912, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0241, loss_cls: 6.2617, loss_bbox: 2.9529, loss: 9.2980
2023-01-20 14:13:11,198 - mmdet - INFO - Epoch [6][1500/3696] lr: 6.000e-05, eta: 6:04:09, time: 0.925, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0247, loss_cls: 6.2765, loss_bbox: 2.9326, loss: 9.2942
2023-01-20 14:13:56,590 - mmdet - INFO - Epoch [6][1550/3696] lr: 6.000e-05, eta: 6:03:25, time: 0.908, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0598, loss_rpn_bbox: 0.0241, loss_cls: 6.2553, loss_bbox: 2.9577, loss: 9.2969
2023-01-20 14:14:41,419 - mmdet - INFO - Epoch [6][1600/3696] lr: 6.000e-05, eta: 6:02:40, time: 0.897, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0594, loss_rpn_bbox: 0.0237, loss_cls: 6.2509, loss_bbox: 2.9610, loss: 9.2950
2023-01-20 14:15:26,857 - mmdet - INFO - Epoch [6][1650/3696] lr: 6.000e-05, eta: 6:01:56, time: 0.909, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0241, loss_cls: 6.2612, loss_bbox: 2.9511, loss: 9.2961
2023-01-20 14:16:12,202 - mmdet - INFO - Epoch [6][1700/3696] lr: 6.000e-05, eta: 6:01:12, time: 0.907, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0602, loss_rpn_bbox: 0.0243, loss_cls: 6.2657, loss_bbox: 2.9427, loss: 9.2929
2023-01-20 14:16:57,825 - mmdet - INFO - Epoch [6][1750/3696] lr: 6.000e-05, eta: 6:00:28, time: 0.912, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0242, loss_cls: 6.2705, loss_bbox: 2.9372, loss: 9.2923
2023-01-20 14:17:43,282 - mmdet - INFO - Epoch [6][1800/3696] lr: 6.000e-05, eta: 5:59:44, time: 0.909, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0595, loss_rpn_bbox: 0.0242, loss_cls: 6.2687, loss_bbox: 2.9600, loss: 9.3125
2023-01-20 14:18:28,923 - mmdet - INFO - Epoch [6][1850/3696] lr: 6.000e-05, eta: 5:59:00, time: 0.913, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0595, loss_rpn_bbox: 0.0245, loss_cls: 6.2767, loss_bbox: 2.9368, loss: 9.2975
2023-01-20 14:19:14,762 - mmdet - INFO - Epoch [6][1900/3696] lr: 6.000e-05, eta: 5:58:16, time: 0.917, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0597, loss_rpn_bbox: 0.0243, loss_cls: 6.2682, loss_bbox: 2.9362, loss: 9.2885
2023-01-20 14:20:02,753 - mmdet - INFO - Epoch [6][1950/3696] lr: 6.000e-05, eta: 5:57:35, time: 0.960, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0242, loss_cls: 6.2603, loss_bbox: 2.9464, loss: 9.2901
2023-01-20 14:20:49,989 - mmdet - INFO - Epoch [6][2000/3696] lr: 6.000e-05, eta: 5:56:53, time: 0.945, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0236, loss_cls: 6.2490, loss_bbox: 2.9665, loss: 9.2975
2023-01-20 14:21:35,341 - mmdet - INFO - Epoch [6][2050/3696] lr: 6.000e-05, eta: 5:56:09, time: 0.907, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0241, loss_cls: 6.2633, loss_bbox: 2.9553, loss: 9.3016
2023-01-20 14:22:22,931 - mmdet - INFO - Epoch [6][2100/3696] lr: 6.000e-05, eta: 5:55:27, time: 0.952, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0239, loss_cls: 6.2589, loss_bbox: 2.9503, loss: 9.2922
2023-01-20 14:23:12,867 - mmdet - INFO - Epoch [6][2150/3696] lr: 6.000e-05, eta: 5:54:48, time: 0.999, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0238, loss_cls: 6.2551, loss_bbox: 2.9610, loss: 9.2985
2023-01-20 14:24:02,551 - mmdet - INFO - Epoch [6][2200/3696] lr: 6.000e-05, eta: 5:54:09, time: 0.993, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0238, loss_cls: 6.2571, loss_bbox: 2.9442, loss: 9.2838
2023-01-20 14:24:50,452 - mmdet - INFO - Epoch [6][2250/3696] lr: 6.000e-05, eta: 5:53:27, time: 0.958, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0245, loss_cls: 6.2708, loss_bbox: 2.9539, loss: 9.3097
2023-01-20 14:25:35,842 - mmdet - INFO - Epoch [6][2300/3696] lr: 6.000e-05, eta: 5:52:43, time: 0.908, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0594, loss_rpn_bbox: 0.0242, loss_cls: 6.2674, loss_bbox: 2.9552, loss: 9.3063
2023-01-20 14:26:21,230 - mmdet - INFO - Epoch [6][2350/3696] lr: 6.000e-05, eta: 5:51:58, time: 0.908, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0239, loss_cls: 6.2581, loss_bbox: 2.9506, loss: 9.2912
2023-01-20 14:27:06,766 - mmdet - INFO - Epoch [6][2400/3696] lr: 6.000e-05, eta: 5:51:14, time: 0.911, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0594, loss_rpn_bbox: 0.0241, loss_cls: 6.2598, loss_bbox: 2.9483, loss: 9.2916
2023-01-20 14:27:52,000 - mmdet - INFO - Epoch [6][2450/3696] lr: 6.000e-05, eta: 5:50:30, time: 0.905, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0605, loss_rpn_bbox: 0.0246, loss_cls: 6.2696, loss_bbox: 2.9334, loss: 9.2880
2023-01-20 14:28:37,468 - mmdet - INFO - Epoch [6][2500/3696] lr: 6.000e-05, eta: 5:49:45, time: 0.909, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0240, loss_cls: 6.2569, loss_bbox: 2.9437, loss: 9.2838
2023-01-20 14:29:22,828 - mmdet - INFO - Epoch [6][2550/3696] lr: 6.000e-05, eta: 5:49:01, time: 0.907, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0239, loss_cls: 6.2530, loss_bbox: 2.9500, loss: 9.2859
2023-01-20 14:30:08,950 - mmdet - INFO - Epoch [6][2600/3696] lr: 6.000e-05, eta: 5:48:18, time: 0.923, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0599, loss_rpn_bbox: 0.0247, loss_cls: 6.2778, loss_bbox: 2.9201, loss: 9.2825
2023-01-20 14:30:53,953 - mmdet - INFO - Epoch [6][2650/3696] lr: 6.000e-05, eta: 5:47:33, time: 0.900, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0239, loss_cls: 6.2556, loss_bbox: 2.9477, loss: 9.2861
2023-01-20 14:31:39,271 - mmdet - INFO - Epoch [6][2700/3696] lr: 6.000e-05, eta: 5:46:48, time: 0.906, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0600, loss_rpn_bbox: 0.0242, loss_cls: 6.2611, loss_bbox: 2.9403, loss: 9.2856
2023-01-20 14:32:24,084 - mmdet - INFO - Epoch [6][2750/3696] lr: 6.000e-05, eta: 5:46:03, time: 0.896, data_time: 0.017, memory: 13163, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0239, loss_cls: 6.2651, loss_bbox: 2.9505, loss: 9.2983
2023-01-20 14:33:09,759 - mmdet - INFO - Epoch [6][2800/3696] lr: 6.000e-05, eta: 5:45:19, time: 0.913, data_time: 0.018, memory: 13163, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0241, loss_cls: 6.2600, loss_bbox: 2.9378, loss: 9.2811
2023-01-20 14:33:56,149 - mmdet - INFO - Epoch [6][2850/3696] lr: 6.000e-05, eta: 5:44:36, time: 0.928, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0244, loss_cls: 6.2593, loss_bbox: 2.9290, loss: 9.2723
2023-01-20 14:34:41,511 - mmdet - INFO - Epoch [6][2900/3696] lr: 6.000e-05, eta: 5:43:52, time: 0.907, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0602, loss_rpn_bbox: 0.0243, loss_cls: 6.2561, loss_bbox: 2.9380, loss: 9.2785
2023-01-20 14:35:27,254 - mmdet - INFO - Epoch [6][2950/3696] lr: 6.000e-05, eta: 5:43:08, time: 0.915, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0600, loss_rpn_bbox: 0.0246, loss_cls: 6.2721, loss_bbox: 2.9361, loss: 9.2928
2023-01-20 14:36:12,379 - mmdet - INFO - Epoch [6][3000/3696] lr: 6.000e-05, eta: 5:42:23, time: 0.902, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0243, loss_cls: 6.2629, loss_bbox: 2.9437, loss: 9.2913
2023-01-20 14:36:57,941 - mmdet - INFO - Epoch [6][3050/3696] lr: 6.000e-05, eta: 5:41:39, time: 0.911, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0240, loss_cls: 6.2595, loss_bbox: 2.9367, loss: 9.2799
2023-01-20 14:37:43,135 - mmdet - INFO - Epoch [6][3100/3696] lr: 6.000e-05, eta: 5:40:54, time: 0.904, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0240, loss_cls: 6.2598, loss_bbox: 2.9419, loss: 9.2843
2023-01-20 14:38:28,381 - mmdet - INFO - Epoch [6][3150/3696] lr: 6.000e-05, eta: 5:40:09, time: 0.905, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0239, loss_cls: 6.2510, loss_bbox: 2.9533, loss: 9.2870
2023-01-20 14:39:13,284 - mmdet - INFO - Epoch [6][3200/3696] lr: 6.000e-05, eta: 5:39:25, time: 0.898, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0239, loss_cls: 6.2545, loss_bbox: 2.9450, loss: 9.2822
2023-01-20 14:39:58,332 - mmdet - INFO - Epoch [6][3250/3696] lr: 6.000e-05, eta: 5:38:40, time: 0.901, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0241, loss_cls: 6.2595, loss_bbox: 2.9520, loss: 9.2944
2023-01-20 14:40:44,303 - mmdet - INFO - Epoch [6][3300/3696] lr: 6.000e-05, eta: 5:37:56, time: 0.919, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0247, loss_cls: 6.2713, loss_bbox: 2.9386, loss: 9.2950
2023-01-20 14:41:29,982 - mmdet - INFO - Epoch [6][3350/3696] lr: 6.000e-05, eta: 5:37:12, time: 0.914, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0593, loss_rpn_bbox: 0.0243, loss_cls: 6.2654, loss_bbox: 2.9356, loss: 9.2845
2023-01-20 14:42:15,612 - mmdet - INFO - Epoch [6][3400/3696] lr: 6.000e-05, eta: 5:36:28, time: 0.913, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0595, loss_rpn_bbox: 0.0244, loss_cls: 6.2712, loss_bbox: 2.9251, loss: 9.2802
2023-01-20 14:43:01,002 - mmdet - INFO - Epoch [6][3450/3696] lr: 6.000e-05, eta: 5:35:43, time: 0.908, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0595, loss_rpn_bbox: 0.0241, loss_cls: 6.2611, loss_bbox: 2.9324, loss: 9.2771
2023-01-20 14:43:46,151 - mmdet - INFO - Epoch [6][3500/3696] lr: 6.000e-05, eta: 5:34:59, time: 0.903, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0242, loss_cls: 6.2612, loss_bbox: 2.9475, loss: 9.2921
2023-01-20 14:44:31,424 - mmdet - INFO - Epoch [6][3550/3696] lr: 6.000e-05, eta: 5:34:14, time: 0.905, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0240, loss_cls: 6.2478, loss_bbox: 2.9511, loss: 9.2821
2023-01-20 14:45:16,951 - mmdet - INFO - Epoch [6][3600/3696] lr: 6.000e-05, eta: 5:33:30, time: 0.910, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0239, loss_cls: 6.2606, loss_bbox: 2.9376, loss: 9.2807
2023-01-20 14:46:03,186 - mmdet - INFO - Epoch [6][3650/3696] lr: 6.000e-05, eta: 5:32:46, time: 0.925, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0242, loss_cls: 6.2670, loss_bbox: 2.9274, loss: 9.2773
2023-01-20 14:46:44,977 - mmdet - INFO - Saving checkpoint at 6 epochs
2023-01-20 14:47:36,567 - mmdet - INFO - Epoch [7][50/3696] lr: 6.000e-05, eta: 5:30:42, time: 0.968, data_time: 0.074, memory: 13220, loss_rpn_cls: 0.0602, loss_rpn_bbox: 0.0247, loss_cls: 6.2739, loss_bbox: 2.9248, loss: 9.2836
2023-01-20 14:48:22,047 - mmdet - INFO - Epoch [7][100/3696] lr: 6.000e-05, eta: 5:29:58, time: 0.910, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0594, loss_rpn_bbox: 0.0243, loss_cls: 6.2599, loss_bbox: 2.9308, loss: 9.2743
2023-01-20 14:49:08,071 - mmdet - INFO - Epoch [7][150/3696] lr: 6.000e-05, eta: 5:29:14, time: 0.920, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0244, loss_cls: 6.2722, loss_bbox: 2.9221, loss: 9.2777
2023-01-20 14:49:53,686 - mmdet - INFO - Epoch [7][200/3696] lr: 6.000e-05, eta: 5:28:30, time: 0.912, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0245, loss_cls: 6.2665, loss_bbox: 2.9256, loss: 9.2756
2023-01-20 14:50:38,971 - mmdet - INFO - Epoch [7][250/3696] lr: 6.000e-05, eta: 5:27:46, time: 0.906, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0238, loss_cls: 6.2578, loss_bbox: 2.9486, loss: 9.2881
2023-01-20 14:51:24,577 - mmdet - INFO - Epoch [7][300/3696] lr: 6.000e-05, eta: 5:27:02, time: 0.912, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0238, loss_cls: 6.2537, loss_bbox: 2.9420, loss: 9.2782
2023-01-20 14:52:09,994 - mmdet - INFO - Epoch [7][350/3696] lr: 6.000e-05, eta: 5:26:17, time: 0.908, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0604, loss_rpn_bbox: 0.0245, loss_cls: 6.2677, loss_bbox: 2.9302, loss: 9.2828
2023-01-20 14:52:56,171 - mmdet - INFO - Epoch [7][400/3696] lr: 6.000e-05, eta: 5:25:34, time: 0.924, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0598, loss_rpn_bbox: 0.0247, loss_cls: 6.2698, loss_bbox: 2.9105, loss: 9.2648
2023-01-20 14:53:42,430 - mmdet - INFO - Epoch [7][450/3696] lr: 6.000e-05, eta: 5:24:50, time: 0.925, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0595, loss_rpn_bbox: 0.0245, loss_cls: 6.2669, loss_bbox: 2.9161, loss: 9.2671
2023-01-20 14:54:27,622 - mmdet - INFO - Epoch [7][500/3696] lr: 6.000e-05, eta: 5:24:06, time: 0.904, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0242, loss_cls: 6.2635, loss_bbox: 2.9342, loss: 9.2804
2023-01-20 14:55:12,758 - mmdet - INFO - Epoch [7][550/3696] lr: 6.000e-05, eta: 5:23:21, time: 0.903, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0236, loss_cls: 6.2481, loss_bbox: 2.9411, loss: 9.2708
2023-01-20 14:55:58,962 - mmdet - INFO - Epoch [7][600/3696] lr: 6.000e-05, eta: 5:22:38, time: 0.924, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0243, loss_cls: 6.2712, loss_bbox: 2.9235, loss: 9.2782
2023-01-20 14:56:44,286 - mmdet - INFO - Epoch [7][650/3696] lr: 6.000e-05, eta: 5:21:53, time: 0.906, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0241, loss_cls: 6.2672, loss_bbox: 2.9290, loss: 9.2791
2023-01-20 14:57:29,613 - mmdet - INFO - Epoch [7][700/3696] lr: 6.000e-05, eta: 5:21:09, time: 0.907, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0240, loss_cls: 6.2597, loss_bbox: 2.9363, loss: 9.2788
2023-01-20 14:58:14,807 - mmdet - INFO - Epoch [7][750/3696] lr: 6.000e-05, eta: 5:20:24, time: 0.904, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0244, loss_cls: 6.2689, loss_bbox: 2.9261, loss: 9.2787
2023-01-20 14:59:00,637 - mmdet - INFO - Epoch [7][800/3696] lr: 6.000e-05, eta: 5:19:40, time: 0.917, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0240, loss_cls: 6.2553, loss_bbox: 2.9325, loss: 9.2710
2023-01-20 14:59:46,699 - mmdet - INFO - Epoch [7][850/3696] lr: 6.000e-05, eta: 5:18:56, time: 0.921, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0601, loss_rpn_bbox: 0.0246, loss_cls: 6.2609, loss_bbox: 2.9211, loss: 9.2667
2023-01-20 15:00:32,121 - mmdet - INFO - Epoch [7][900/3696] lr: 6.000e-05, eta: 5:18:12, time: 0.908, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0242, loss_cls: 6.2599, loss_bbox: 2.9288, loss: 9.2718
2023-01-20 15:01:17,384 - mmdet - INFO - Epoch [7][950/3696] lr: 6.000e-05, eta: 5:17:28, time: 0.905, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0237, loss_cls: 6.2455, loss_bbox: 2.9221, loss: 9.2494
2023-01-20 15:02:02,900 - mmdet - INFO - Epoch [7][1000/3696] lr: 6.000e-05, eta: 5:16:43, time: 0.910, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0242, loss_cls: 6.2604, loss_bbox: 2.9347, loss: 9.2781
2023-01-20 15:02:48,410 - mmdet - INFO - Epoch [7][1050/3696] lr: 6.000e-05, eta: 5:15:59, time: 0.910, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0243, loss_cls: 6.2659, loss_bbox: 2.9313, loss: 9.2802
2023-01-20 15:03:35,216 - mmdet - INFO - Epoch [7][1100/3696] lr: 6.000e-05, eta: 5:15:16, time: 0.936, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0605, loss_rpn_bbox: 0.0249, loss_cls: 6.2778, loss_bbox: 2.8857, loss: 9.2489
2023-01-20 15:04:21,997 - mmdet - INFO - Epoch [7][1150/3696] lr: 6.000e-05, eta: 5:14:33, time: 0.936, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0593, loss_rpn_bbox: 0.0248, loss_cls: 6.2795, loss_bbox: 2.9068, loss: 9.2703
2023-01-20 15:05:07,250 - mmdet - INFO - Epoch [7][1200/3696] lr: 6.000e-05, eta: 5:13:48, time: 0.905, data_time: 0.018, memory: 13220, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0232, loss_cls: 6.2444, loss_bbox: 2.9470, loss: 9.2723
2023-01-20
15:05:52,430 - mmdet - INFO - Epoch [7][1250/3696] lr: 6.000e-05, eta: 5:13:04, time: 0.904, data_time: 0.017, memory: 13220, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0236, loss_cls: 6.2449, loss_bbox: 2.9516, loss: 9.2782 2023-01-20 15:06:37,615 - mmdet - INFO - Epoch [7][1300/3696] lr: 6.000e-05, eta: 5:12:19, time: 0.904, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0239, loss_cls: 6.2588, loss_bbox: 2.9286, loss: 9.2701 2023-01-20 15:07:23,306 - mmdet - INFO - Epoch [7][1350/3696] lr: 6.000e-05, eta: 5:11:35, time: 0.914, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0597, loss_rpn_bbox: 0.0244, loss_cls: 6.2643, loss_bbox: 2.9155, loss: 9.2639 2023-01-20 15:08:08,793 - mmdet - INFO - Epoch [7][1400/3696] lr: 6.000e-05, eta: 5:10:50, time: 0.910, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0239, loss_cls: 6.2521, loss_bbox: 2.9313, loss: 9.2669 2023-01-20 15:08:54,966 - mmdet - INFO - Epoch [7][1450/3696] lr: 6.000e-05, eta: 5:10:07, time: 0.923, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0595, loss_rpn_bbox: 0.0243, loss_cls: 6.2604, loss_bbox: 2.9126, loss: 9.2569 2023-01-20 15:09:40,209 - mmdet - INFO - Epoch [7][1500/3696] lr: 6.000e-05, eta: 5:09:22, time: 0.905, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0241, loss_cls: 6.2633, loss_bbox: 2.9408, loss: 9.2872 2023-01-20 15:10:25,564 - mmdet - INFO - Epoch [7][1550/3696] lr: 6.000e-05, eta: 5:08:38, time: 0.907, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0240, loss_cls: 6.2535, loss_bbox: 2.9250, loss: 9.2613 2023-01-20 15:11:10,810 - mmdet - INFO - Epoch [7][1600/3696] lr: 6.000e-05, eta: 5:07:53, time: 0.905, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0238, loss_cls: 6.2514, loss_bbox: 2.9348, loss: 9.2690 2023-01-20 15:11:55,686 - mmdet - INFO - Epoch [7][1650/3696] lr: 6.000e-05, eta: 5:07:08, time: 0.898, data_time: 0.017, memory: 13290, loss_rpn_cls: 
0.0584, loss_rpn_bbox: 0.0236, loss_cls: 6.2471, loss_bbox: 2.9362, loss: 9.2652 2023-01-20 15:12:41,699 - mmdet - INFO - Epoch [7][1700/3696] lr: 6.000e-05, eta: 5:06:24, time: 0.920, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0600, loss_rpn_bbox: 0.0244, loss_cls: 6.2679, loss_bbox: 2.9184, loss: 9.2707 2023-01-20 15:13:27,268 - mmdet - INFO - Epoch [7][1750/3696] lr: 6.000e-05, eta: 5:05:40, time: 0.911, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0601, loss_rpn_bbox: 0.0247, loss_cls: 6.2757, loss_bbox: 2.9134, loss: 9.2739 2023-01-20 15:14:13,400 - mmdet - INFO - Epoch [7][1800/3696] lr: 6.000e-05, eta: 5:04:56, time: 0.923, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0243, loss_cls: 6.2598, loss_bbox: 2.9069, loss: 9.2498 2023-01-20 15:14:58,841 - mmdet - INFO - Epoch [7][1850/3696] lr: 6.000e-05, eta: 5:04:12, time: 0.909, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0242, loss_cls: 6.2659, loss_bbox: 2.9305, loss: 9.2797 2023-01-20 15:15:44,034 - mmdet - INFO - Epoch [7][1900/3696] lr: 6.000e-05, eta: 5:03:27, time: 0.904, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0236, loss_cls: 6.2526, loss_bbox: 2.9380, loss: 9.2722 2023-01-20 15:16:29,696 - mmdet - INFO - Epoch [7][1950/3696] lr: 6.000e-05, eta: 5:02:43, time: 0.913, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0238, loss_cls: 6.2557, loss_bbox: 2.9353, loss: 9.2732 2023-01-20 15:17:15,083 - mmdet - INFO - Epoch [7][2000/3696] lr: 6.000e-05, eta: 5:01:58, time: 0.908, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0241, loss_cls: 6.2602, loss_bbox: 2.9317, loss: 9.2748 2023-01-20 15:18:00,717 - mmdet - INFO - Epoch [7][2050/3696] lr: 6.000e-05, eta: 5:01:14, time: 0.913, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0239, loss_cls: 6.2672, loss_bbox: 2.9279, loss: 9.2777 2023-01-20 15:18:45,895 - mmdet - INFO - Epoch [7][2100/3696] 
lr: 6.000e-05, eta: 5:00:29, time: 0.904, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0234, loss_cls: 6.2358, loss_bbox: 2.9348, loss: 9.2523 2023-01-20 15:19:31,035 - mmdet - INFO - Epoch [7][2150/3696] lr: 6.000e-05, eta: 4:59:45, time: 0.903, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0241, loss_cls: 6.2648, loss_bbox: 2.9288, loss: 9.2766 2023-01-20 15:20:16,367 - mmdet - INFO - Epoch [7][2200/3696] lr: 6.000e-05, eta: 4:59:00, time: 0.907, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0594, loss_rpn_bbox: 0.0245, loss_cls: 6.2683, loss_bbox: 2.9213, loss: 9.2735 2023-01-20 15:21:01,901 - mmdet - INFO - Epoch [7][2250/3696] lr: 6.000e-05, eta: 4:58:16, time: 0.911, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0241, loss_cls: 6.2585, loss_bbox: 2.9220, loss: 9.2642 2023-01-20 15:21:47,771 - mmdet - INFO - Epoch [7][2300/3696] lr: 6.000e-05, eta: 4:57:31, time: 0.917, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0597, loss_rpn_bbox: 0.0247, loss_cls: 6.2741, loss_bbox: 2.9071, loss: 9.2656 2023-01-20 15:22:33,647 - mmdet - INFO - Epoch [7][2350/3696] lr: 6.000e-05, eta: 4:56:47, time: 0.918, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0244, loss_cls: 6.2746, loss_bbox: 2.9150, loss: 9.2728 2023-01-20 15:23:19,333 - mmdet - INFO - Epoch [7][2400/3696] lr: 6.000e-05, eta: 4:56:03, time: 0.914, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0241, loss_cls: 6.2541, loss_bbox: 2.9378, loss: 9.2746 2023-01-20 15:24:04,526 - mmdet - INFO - Epoch [7][2450/3696] lr: 6.000e-05, eta: 4:55:18, time: 0.904, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0242, loss_cls: 6.2654, loss_bbox: 2.9368, loss: 9.2855 2023-01-20 15:24:50,311 - mmdet - INFO - Epoch [7][2500/3696] lr: 6.000e-05, eta: 4:54:34, time: 0.916, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0239, loss_cls: 6.2500, 
loss_bbox: 2.9197, loss: 9.2527 2023-01-20 15:25:35,564 - mmdet - INFO - Epoch [7][2550/3696] lr: 6.000e-05, eta: 4:53:49, time: 0.905, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0243, loss_cls: 6.2592, loss_bbox: 2.9228, loss: 9.2651 2023-01-20 15:26:20,966 - mmdet - INFO - Epoch [7][2600/3696] lr: 6.000e-05, eta: 4:53:05, time: 0.908, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0238, loss_cls: 6.2532, loss_bbox: 2.9200, loss: 9.2549 2023-01-20 15:27:07,005 - mmdet - INFO - Epoch [7][2650/3696] lr: 6.000e-05, eta: 4:52:21, time: 0.921, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0243, loss_cls: 6.2632, loss_bbox: 2.9294, loss: 9.2756 2023-01-20 15:27:52,304 - mmdet - INFO - Epoch [7][2700/3696] lr: 6.000e-05, eta: 4:51:36, time: 0.906, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0238, loss_cls: 6.2579, loss_bbox: 2.9317, loss: 9.2713 2023-01-20 15:28:37,490 - mmdet - INFO - Epoch [7][2750/3696] lr: 6.000e-05, eta: 4:50:52, time: 0.904, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0240, loss_cls: 6.2578, loss_bbox: 2.9296, loss: 9.2700 2023-01-20 15:29:23,653 - mmdet - INFO - Epoch [7][2800/3696] lr: 6.000e-05, eta: 4:50:08, time: 0.923, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0598, loss_rpn_bbox: 0.0247, loss_cls: 6.2708, loss_bbox: 2.9067, loss: 9.2620 2023-01-20 15:30:09,199 - mmdet - INFO - Epoch [7][2850/3696] lr: 6.000e-05, eta: 4:49:23, time: 0.911, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0243, loss_cls: 6.2695, loss_bbox: 2.9272, loss: 9.2796 2023-01-20 15:30:55,305 - mmdet - INFO - Epoch [7][2900/3696] lr: 6.000e-05, eta: 4:48:39, time: 0.922, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0245, loss_cls: 6.2601, loss_bbox: 2.9055, loss: 9.2497 2023-01-20 15:31:40,764 - mmdet - INFO - Epoch [7][2950/3696] lr: 6.000e-05, eta: 4:47:55, time: 0.909, 
data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0241, loss_cls: 6.2517, loss_bbox: 2.9240, loss: 9.2586 2023-01-20 15:32:26,471 - mmdet - INFO - Epoch [7][3000/3696] lr: 6.000e-05, eta: 4:47:10, time: 0.914, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0240, loss_cls: 6.2504, loss_bbox: 2.9238, loss: 9.2566 2023-01-20 15:33:11,789 - mmdet - INFO - Epoch [7][3050/3696] lr: 6.000e-05, eta: 4:46:26, time: 0.906, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0237, loss_cls: 6.2555, loss_bbox: 2.9305, loss: 9.2675 2023-01-20 15:33:56,760 - mmdet - INFO - Epoch [7][3100/3696] lr: 6.000e-05, eta: 4:45:41, time: 0.899, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0234, loss_cls: 6.2368, loss_bbox: 2.9286, loss: 9.2471 2023-01-20 15:34:41,479 - mmdet - INFO - Epoch [7][3150/3696] lr: 6.000e-05, eta: 4:44:56, time: 0.894, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0237, loss_cls: 6.2494, loss_bbox: 2.9455, loss: 9.2774 2023-01-20 15:35:27,630 - mmdet - INFO - Epoch [7][3200/3696] lr: 6.000e-05, eta: 4:44:12, time: 0.923, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0246, loss_cls: 6.2594, loss_bbox: 2.9002, loss: 9.2433 2023-01-20 15:36:12,927 - mmdet - INFO - Epoch [7][3250/3696] lr: 6.000e-05, eta: 4:43:27, time: 0.906, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0593, loss_rpn_bbox: 0.0241, loss_cls: 6.2587, loss_bbox: 2.9237, loss: 9.2658 2023-01-20 15:36:58,513 - mmdet - INFO - Epoch [7][3300/3696] lr: 6.000e-05, eta: 4:42:43, time: 0.912, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0243, loss_cls: 6.2571, loss_bbox: 2.9178, loss: 9.2572 2023-01-20 15:37:43,451 - mmdet - INFO - Epoch [7][3350/3696] lr: 6.000e-05, eta: 4:41:58, time: 0.899, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0232, loss_cls: 6.2348, loss_bbox: 2.9275, loss: 9.2427 2023-01-20 
15:38:28,705 - mmdet - INFO - Epoch [7][3400/3696] lr: 6.000e-05, eta: 4:41:13, time: 0.905, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0241, loss_cls: 6.2542, loss_bbox: 2.9266, loss: 9.2638 2023-01-20 15:39:14,268 - mmdet - INFO - Epoch [7][3450/3696] lr: 6.000e-05, eta: 4:40:28, time: 0.911, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0245, loss_cls: 6.2773, loss_bbox: 2.9123, loss: 9.2738 2023-01-20 15:40:00,331 - mmdet - INFO - Epoch [7][3500/3696] lr: 6.000e-05, eta: 4:39:44, time: 0.921, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0244, loss_cls: 6.2706, loss_bbox: 2.9074, loss: 9.2615 2023-01-20 15:40:44,864 - mmdet - INFO - Epoch [7][3550/3696] lr: 6.000e-05, eta: 4:38:59, time: 0.891, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0566, loss_rpn_bbox: 0.0229, loss_cls: 6.2269, loss_bbox: 2.9353, loss: 9.2418 2023-01-20 15:41:30,628 - mmdet - INFO - Epoch [7][3600/3696] lr: 6.000e-05, eta: 4:38:15, time: 0.915, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0584, loss_rpn_bbox: 0.0243, loss_cls: 6.2722, loss_bbox: 2.9149, loss: 9.2699 2023-01-20 15:42:15,195 - mmdet - INFO - Epoch [7][3650/3696] lr: 6.000e-05, eta: 4:37:30, time: 0.891, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0237, loss_cls: 6.2509, loss_bbox: 2.9379, loss: 9.2706 2023-01-20 15:42:57,417 - mmdet - INFO - Saving checkpoint at 7 epochs 2023-01-20 15:43:48,875 - mmdet - INFO - Epoch [8][50/3696] lr: 6.000e-05, eta: 4:35:36, time: 0.964, data_time: 0.075, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0238, loss_cls: 6.2506, loss_bbox: 2.9135, loss: 9.2468 2023-01-20 15:44:33,901 - mmdet - INFO - Epoch [8][100/3696] lr: 6.000e-05, eta: 4:34:51, time: 0.901, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0238, loss_cls: 6.2470, loss_bbox: 2.9329, loss: 9.2624 2023-01-20 15:45:19,688 - mmdet - INFO - Epoch [8][150/3696] lr: 6.000e-05, eta: 
4:34:07, time: 0.916, data_time: 0.021, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0243, loss_cls: 6.2682, loss_bbox: 2.9169, loss: 9.2681 2023-01-20 15:46:05,266 - mmdet - INFO - Epoch [8][200/3696] lr: 6.000e-05, eta: 4:33:23, time: 0.912, data_time: 0.021, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0241, loss_cls: 6.2659, loss_bbox: 2.9025, loss: 9.2506 2023-01-20 15:46:51,631 - mmdet - INFO - Epoch [8][250/3696] lr: 6.000e-05, eta: 4:32:39, time: 0.927, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0594, loss_rpn_bbox: 0.0247, loss_cls: 6.2749, loss_bbox: 2.8982, loss: 9.2571 2023-01-20 15:47:37,143 - mmdet - INFO - Epoch [8][300/3696] lr: 6.000e-05, eta: 4:31:55, time: 0.910, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0241, loss_cls: 6.2549, loss_bbox: 2.9124, loss: 9.2495 2023-01-20 15:48:22,837 - mmdet - INFO - Epoch [8][350/3696] lr: 6.000e-05, eta: 4:31:10, time: 0.914, data_time: 0.021, memory: 13290, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0245, loss_cls: 6.2703, loss_bbox: 2.8987, loss: 9.2531 2023-01-20 15:49:08,608 - mmdet - INFO - Epoch [8][400/3696] lr: 6.000e-05, eta: 4:30:26, time: 0.915, data_time: 0.021, memory: 13290, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0246, loss_cls: 6.2674, loss_bbox: 2.9073, loss: 9.2589 2023-01-20 15:49:54,414 - mmdet - INFO - Epoch [8][450/3696] lr: 6.000e-05, eta: 4:29:42, time: 0.916, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0244, loss_cls: 6.2658, loss_bbox: 2.9038, loss: 9.2532 2023-01-20 15:50:39,763 - mmdet - INFO - Epoch [8][500/3696] lr: 6.000e-05, eta: 4:28:57, time: 0.907, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0243, loss_cls: 6.2635, loss_bbox: 2.9288, loss: 9.2757 2023-01-20 15:51:25,425 - mmdet - INFO - Epoch [8][550/3696] lr: 6.000e-05, eta: 4:28:13, time: 0.913, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0245, loss_cls: 6.2733, loss_bbox: 2.9055, loss: 9.2623 
2023-01-20 15:52:11,050 - mmdet - INFO - Epoch [8][600/3696] lr: 6.000e-05, eta: 4:27:29, time: 0.912, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0243, loss_cls: 6.2630, loss_bbox: 2.8990, loss: 9.2453
2023-01-20 15:52:57,172 - mmdet - INFO - Epoch [8][650/3696] lr: 6.000e-05, eta: 4:26:45, time: 0.922, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0240, loss_cls: 6.2533, loss_bbox: 2.9037, loss: 9.2392
2023-01-20 15:53:42,923 - mmdet - INFO - Epoch [8][700/3696] lr: 6.000e-05, eta: 4:26:00, time: 0.915, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0236, loss_cls: 6.2466, loss_bbox: 2.9238, loss: 9.2514
2023-01-20 15:54:27,799 - mmdet - INFO - Epoch [8][750/3696] lr: 6.000e-05, eta: 4:25:15, time: 0.898, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0569, loss_rpn_bbox: 0.0234, loss_cls: 6.2444, loss_bbox: 2.9357, loss: 9.2603
2023-01-20 15:55:13,389 - mmdet - INFO - Epoch [8][800/3696] lr: 6.000e-05, eta: 4:24:31, time: 0.912, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0238, loss_cls: 6.2506, loss_bbox: 2.9188, loss: 9.2510
2023-01-20 15:55:59,280 - mmdet - INFO - Epoch [8][850/3696] lr: 6.000e-05, eta: 4:23:47, time: 0.918, data_time: 0.021, memory: 13290, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0242, loss_cls: 6.2600, loss_bbox: 2.9106, loss: 9.2536
2023-01-20 15:56:44,953 - mmdet - INFO - Epoch [8][900/3696] lr: 6.000e-05, eta: 4:23:02, time: 0.913, data_time: 0.021, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0236, loss_cls: 6.2478, loss_bbox: 2.9203, loss: 9.2496
2023-01-20 15:57:29,649 - mmdet - INFO - Epoch [8][950/3696] lr: 6.000e-05, eta: 4:22:17, time: 0.894, data_time: 0.021, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0238, loss_cls: 6.2534, loss_bbox: 2.9269, loss: 9.2620
2023-01-20 15:58:15,088 - mmdet - INFO - Epoch [8][1000/3696] lr: 6.000e-05, eta: 4:21:33, time: 0.909, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0243, loss_cls: 6.2688, loss_bbox: 2.9175, loss: 9.2695
2023-01-20 15:59:00,856 - mmdet - INFO - Epoch [8][1050/3696] lr: 6.000e-05, eta: 4:20:48, time: 0.915, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0237, loss_cls: 6.2527, loss_bbox: 2.9129, loss: 9.2470
2023-01-20 15:59:46,595 - mmdet - INFO - Epoch [8][1100/3696] lr: 6.000e-05, eta: 4:20:04, time: 0.915, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0239, loss_cls: 6.2433, loss_bbox: 2.9082, loss: 9.2336
2023-01-20 16:00:33,384 - mmdet - INFO - Epoch [8][1150/3696] lr: 6.000e-05, eta: 4:19:20, time: 0.936, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0244, loss_cls: 6.2590, loss_bbox: 2.8883, loss: 9.2307
2023-01-20 16:01:19,072 - mmdet - INFO - Epoch [8][1200/3696] lr: 6.000e-05, eta: 4:18:36, time: 0.914, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0241, loss_cls: 6.2676, loss_bbox: 2.9127, loss: 9.2632
2023-01-20 16:02:04,704 - mmdet - INFO - Epoch [8][1250/3696] lr: 6.000e-05, eta: 4:17:52, time: 0.913, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0238, loss_cls: 6.2589, loss_bbox: 2.9176, loss: 9.2581
2023-01-20 16:02:49,810 - mmdet - INFO - Epoch [8][1300/3696] lr: 6.000e-05, eta: 4:17:07, time: 0.902, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0238, loss_cls: 6.2592, loss_bbox: 2.9150, loss: 9.2565
2023-01-20 16:03:35,964 - mmdet - INFO - Epoch [8][1350/3696] lr: 6.000e-05, eta: 4:16:23, time: 0.923, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0243, loss_cls: 6.2636, loss_bbox: 2.9054, loss: 9.2517
2023-01-20 16:04:21,479 - mmdet - INFO - Epoch [8][1400/3696] lr: 6.000e-05, eta: 4:15:38, time: 0.910, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0240, loss_cls: 6.2543, loss_bbox: 2.9181, loss: 9.2550
2023-01-20 16:05:06,717 - mmdet - INFO - Epoch [8][1450/3696] lr: 6.000e-05, eta: 4:14:54, time: 0.905, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0235, loss_cls: 6.2501, loss_bbox: 2.9271, loss: 9.2580
2023-01-20 16:05:52,111 - mmdet - INFO - Epoch [8][1500/3696] lr: 6.000e-05, eta: 4:14:09, time: 0.908, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0237, loss_cls: 6.2534, loss_bbox: 2.9198, loss: 9.2551
2023-01-20 16:06:37,556 - mmdet - INFO - Epoch [8][1550/3696] lr: 6.000e-05, eta: 4:13:24, time: 0.909, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0244, loss_cls: 6.2628, loss_bbox: 2.9019, loss: 9.2477
2023-01-20 16:07:22,773 - mmdet - INFO - Epoch [8][1600/3696] lr: 6.000e-05, eta: 4:12:40, time: 0.904, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0238, loss_cls: 6.2482, loss_bbox: 2.9132, loss: 9.2436
2023-01-20 16:08:07,495 - mmdet - INFO - Epoch [8][1650/3696] lr: 6.000e-05, eta: 4:11:55, time: 0.894, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0237, loss_cls: 6.2577, loss_bbox: 2.9273, loss: 9.2663
2023-01-20 16:08:53,528 - mmdet - INFO - Epoch [8][1700/3696] lr: 6.000e-05, eta: 4:11:10, time: 0.921, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0241, loss_cls: 6.2555, loss_bbox: 2.9152, loss: 9.2536
2023-01-20 16:09:38,708 - mmdet - INFO - Epoch [8][1750/3696] lr: 6.000e-05, eta: 4:10:26, time: 0.904, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0236, loss_cls: 6.2501, loss_bbox: 2.9332, loss: 9.2647
2023-01-20 16:10:24,222 - mmdet - INFO - Epoch [8][1800/3696] lr: 6.000e-05, eta: 4:09:41, time: 0.910, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0236, loss_cls: 6.2462, loss_bbox: 2.9060, loss: 9.2333
2023-01-20 16:11:10,112 - mmdet - INFO - Epoch [8][1850/3696] lr: 6.000e-05, eta: 4:08:57, time: 0.918, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0237, loss_cls: 6.2541, loss_bbox: 2.9160, loss: 9.2519
2023-01-20 16:11:55,848 - mmdet - INFO - Epoch [8][1900/3696] lr: 6.000e-05, eta: 4:08:12, time: 0.915, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0239, loss_cls: 6.2545, loss_bbox: 2.8980, loss: 9.2351
2023-01-20 16:12:41,531 - mmdet - INFO - Epoch [8][1950/3696] lr: 6.000e-05, eta: 4:07:28, time: 0.914, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0241, loss_cls: 6.2543, loss_bbox: 2.9054, loss: 9.2424
2023-01-20 16:13:27,060 - mmdet - INFO - Epoch [8][2000/3696] lr: 6.000e-05, eta: 4:06:43, time: 0.911, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0239, loss_cls: 6.2566, loss_bbox: 2.9153, loss: 9.2544
2023-01-20 16:14:13,147 - mmdet - INFO - Epoch [8][2050/3696] lr: 6.000e-05, eta: 4:05:59, time: 0.922, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0248, loss_cls: 6.2800, loss_bbox: 2.8989, loss: 9.2624
2023-01-20 16:14:58,425 - mmdet - INFO - Epoch [8][2100/3696] lr: 6.000e-05, eta: 4:05:14, time: 0.906, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0242, loss_cls: 6.2582, loss_bbox: 2.9210, loss: 9.2621
2023-01-20 16:15:43,804 - mmdet - INFO - Epoch [8][2150/3696] lr: 6.000e-05, eta: 4:04:30, time: 0.908, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0239, loss_cls: 6.2473, loss_bbox: 2.9069, loss: 9.2358
2023-01-20 16:16:29,152 - mmdet - INFO - Epoch [8][2200/3696] lr: 6.000e-05, eta: 4:03:45, time: 0.907, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0236, loss_cls: 6.2553, loss_bbox: 2.9168, loss: 9.2533
2023-01-20 16:17:14,382 - mmdet - INFO - Epoch [8][2250/3696] lr: 6.000e-05, eta: 4:03:00, time: 0.905, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0237, loss_cls: 6.2504, loss_bbox: 2.9163, loss: 9.2479
2023-01-20 16:18:00,102 - mmdet - INFO - Epoch [8][2300/3696] lr: 6.000e-05, eta: 4:02:16, time: 0.914, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0241, loss_cls: 6.2528, loss_bbox: 2.9034, loss: 9.2384
2023-01-20 16:18:44,874 - mmdet - INFO - Epoch [8][2350/3696] lr: 6.000e-05, eta: 4:01:31, time: 0.895, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0236, loss_cls: 6.2403, loss_bbox: 2.9174, loss: 9.2393
2023-01-20 16:19:30,576 - mmdet - INFO - Epoch [8][2400/3696] lr: 6.000e-05, eta: 4:00:46, time: 0.914, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0584, loss_rpn_bbox: 0.0241, loss_cls: 6.2516, loss_bbox: 2.8998, loss: 9.2340
2023-01-20 16:20:16,351 - mmdet - INFO - Epoch [8][2450/3696] lr: 6.000e-05, eta: 4:00:02, time: 0.915, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0240, loss_cls: 6.2507, loss_bbox: 2.8950, loss: 9.2284
2023-01-20 16:21:02,523 - mmdet - INFO - Epoch [8][2500/3696] lr: 6.000e-05, eta: 3:59:18, time: 0.923, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0242, loss_cls: 6.2611, loss_bbox: 2.9023, loss: 9.2466
2023-01-20 16:21:47,313 - mmdet - INFO - Epoch [8][2550/3696] lr: 6.000e-05, eta: 3:58:33, time: 0.896, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0239, loss_cls: 6.2553, loss_bbox: 2.9349, loss: 9.2726
2023-01-20 16:22:32,591 - mmdet - INFO - Epoch [8][2600/3696] lr: 6.000e-05, eta: 3:57:48, time: 0.906, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0242, loss_cls: 6.2577, loss_bbox: 2.9108, loss: 9.2517
2023-01-20 16:23:17,709 - mmdet - INFO - Epoch [8][2650/3696] lr: 6.000e-05, eta: 3:57:03, time: 0.902, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0235, loss_cls: 6.2445, loss_bbox: 2.9137, loss: 9.2388
2023-01-20 16:24:03,326 - mmdet - INFO - Epoch [8][2700/3696] lr: 6.000e-05, eta: 3:56:19, time: 0.912, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0243, loss_cls: 6.2681, loss_bbox: 2.8927, loss: 9.2433
2023-01-20 16:24:48,675 - mmdet - INFO - Epoch [8][2750/3696] lr: 6.000e-05, eta: 3:55:34, time: 0.907, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0240, loss_cls: 6.2555, loss_bbox: 2.9156, loss: 9.2532
2023-01-20 16:25:36,321 - mmdet - INFO - Epoch [8][2800/3696] lr: 6.000e-05, eta: 3:54:51, time: 0.953, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0242, loss_cls: 6.2619, loss_bbox: 2.8877, loss: 9.2325
2023-01-20 16:26:22,636 - mmdet - INFO - Epoch [8][2850/3696] lr: 6.000e-05, eta: 3:54:06, time: 0.926, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0596, loss_rpn_bbox: 0.0248, loss_cls: 6.2792, loss_bbox: 2.8909, loss: 9.2546
2023-01-20 16:27:09,894 - mmdet - INFO - Epoch [8][2900/3696] lr: 6.000e-05, eta: 3:53:23, time: 0.945, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0236, loss_cls: 6.2388, loss_bbox: 2.9146, loss: 9.2342
2023-01-20 16:27:55,358 - mmdet - INFO - Epoch [8][2950/3696] lr: 6.000e-05, eta: 3:52:38, time: 0.909, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0240, loss_cls: 6.2608, loss_bbox: 2.9110, loss: 9.2540
2023-01-20 16:28:42,796 - mmdet - INFO - Epoch [8][3000/3696] lr: 6.000e-05, eta: 3:51:54, time: 0.949, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0243, loss_cls: 6.2693, loss_bbox: 2.9114, loss: 9.2635
2023-01-20 16:29:38,301 - mmdet - INFO - Epoch [8][3050/3696] lr: 6.000e-05, eta: 3:51:15, time: 1.110, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0239, loss_cls: 6.2579, loss_bbox: 2.9095, loss: 9.2491
2023-01-20 16:30:23,428 - mmdet - INFO - Epoch [8][3100/3696] lr: 6.000e-05, eta: 3:50:30, time: 0.903, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0239, loss_cls: 6.2474, loss_bbox: 2.9000, loss: 9.2297
2023-01-20 16:31:09,292 - mmdet - INFO - Epoch [8][3150/3696] lr: 6.000e-05, eta: 3:49:46, time: 0.917, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0243, loss_cls: 6.2603, loss_bbox: 2.8951, loss: 9.2387
2023-01-20 16:31:54,898 - mmdet - INFO - Epoch [8][3200/3696] lr: 6.000e-05, eta: 3:49:01, time: 0.912, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0244, loss_cls: 6.2596, loss_bbox: 2.9022, loss: 9.2446
2023-01-20 16:32:40,752 - mmdet - INFO - Epoch [8][3250/3696] lr: 6.000e-05, eta: 3:48:17, time: 0.917, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0246, loss_cls: 6.2726, loss_bbox: 2.8919, loss: 9.2476
2023-01-20 16:33:26,100 - mmdet - INFO - Epoch [8][3300/3696] lr: 6.000e-05, eta: 3:47:32, time: 0.907, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0245, loss_cls: 6.2674, loss_bbox: 2.8919, loss: 9.2430
2023-01-20 16:34:11,607 - mmdet - INFO - Epoch [8][3350/3696] lr: 6.000e-05, eta: 3:46:47, time: 0.910, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0242, loss_cls: 6.2682, loss_bbox: 2.8915, loss: 9.2424
2023-01-20 16:34:57,621 - mmdet - INFO - Epoch [8][3400/3696] lr: 6.000e-05, eta: 3:46:03, time: 0.920, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0242, loss_cls: 6.2618, loss_bbox: 2.9020, loss: 9.2464
2023-01-20 16:35:42,496 - mmdet - INFO - Epoch [8][3450/3696] lr: 6.000e-05, eta: 3:45:18, time: 0.898, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0238, loss_cls: 6.2522, loss_bbox: 2.9187, loss: 9.2528
2023-01-20 16:36:28,506 - mmdet - INFO - Epoch [8][3500/3696] lr: 6.000e-05, eta: 3:44:33, time: 0.920, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0242, loss_cls: 6.2602, loss_bbox: 2.8876, loss: 9.2301
2023-01-20 16:37:13,948 - mmdet - INFO - Epoch [8][3550/3696] lr: 6.000e-05, eta: 3:43:49, time: 0.909, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0238, loss_cls: 6.2567, loss_bbox: 2.9114, loss: 9.2495
2023-01-20 16:37:58,128 - mmdet - INFO - Epoch [8][3600/3696] lr: 6.000e-05, eta: 3:43:03, time: 0.884, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0563, loss_rpn_bbox: 0.0233, loss_cls: 6.2433, loss_bbox: 2.9386, loss: 9.2615
2023-01-20 16:38:44,441 - mmdet - INFO - Epoch [8][3650/3696] lr: 6.000e-05, eta: 3:42:19, time: 0.926, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0244, loss_cls: 6.2691, loss_bbox: 2.8953, loss: 9.2478
2023-01-20 16:39:26,670 - mmdet - INFO - Saving checkpoint at 8 epochs
2023-01-20 16:40:19,019 - mmdet - INFO - Epoch [9][50/3696] lr: 6.000e-06, eta: 3:40:34, time: 0.977, data_time: 0.075, memory: 13290, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0243, loss_cls: 6.2603, loss_bbox: 2.8789, loss: 9.2220
2023-01-20 16:41:04,478 - mmdet - INFO - Epoch [9][100/3696] lr: 6.000e-06, eta: 3:39:49, time: 0.909, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0566, loss_rpn_bbox: 0.0234, loss_cls: 6.2471, loss_bbox: 2.9031, loss: 9.2301
2023-01-20 16:41:49,910 - mmdet - INFO - Epoch [9][150/3696] lr: 6.000e-06, eta: 3:39:05, time: 0.909, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0240, loss_cls: 6.2535, loss_bbox: 2.8876, loss: 9.2227
2023-01-20 16:42:34,745 - mmdet - INFO - Epoch [9][200/3696] lr: 6.000e-06, eta: 3:38:20, time: 0.897, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0235, loss_cls: 6.2341, loss_bbox: 2.9013, loss: 9.2160
2023-01-20 16:43:20,140 - mmdet - INFO - Epoch [9][250/3696] lr: 6.000e-06, eta: 3:37:35, time: 0.908, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0239, loss_cls: 6.2454, loss_bbox: 2.8800, loss: 9.2074
2023-01-20 16:44:05,719 - mmdet - INFO - Epoch [9][300/3696] lr: 6.000e-06, eta: 3:36:50, time: 0.912, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0240, loss_cls: 6.2432, loss_bbox: 2.8911, loss: 9.2160
2023-01-20 16:44:51,796 - mmdet - INFO - Epoch [9][350/3696] lr: 6.000e-06, eta: 3:36:06, time: 0.922, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0239, loss_cls: 6.2501, loss_bbox: 2.8871, loss: 9.2185
2023-01-20 16:45:37,566 - mmdet - INFO - Epoch [9][400/3696] lr: 6.000e-06, eta: 3:35:22, time: 0.915, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0242, loss_cls: 6.2565, loss_bbox: 2.8758, loss: 9.2150
2023-01-20 16:46:23,239 - mmdet - INFO - Epoch [9][450/3696] lr: 6.000e-06, eta: 3:34:37, time: 0.913, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0240, loss_cls: 6.2458, loss_bbox: 2.8733, loss: 9.2011
2023-01-20 16:47:08,190 - mmdet - INFO - Epoch [9][500/3696] lr: 6.000e-06, eta: 3:33:52, time: 0.899, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0239, loss_cls: 6.2587, loss_bbox: 2.8876, loss: 9.2277
2023-01-20 16:47:53,992 - mmdet - INFO - Epoch [9][550/3696] lr: 6.000e-06, eta: 3:33:08, time: 0.916, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0240, loss_cls: 6.2505, loss_bbox: 2.8809, loss: 9.2135
2023-01-20 16:48:40,145 - mmdet - INFO - Epoch [9][600/3696] lr: 6.000e-06, eta: 3:32:23, time: 0.923, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0237, loss_cls: 6.2449, loss_bbox: 2.8869, loss: 9.2128
2023-01-20 16:49:24,940 - mmdet - INFO - Epoch [9][650/3696] lr: 6.000e-06, eta: 3:31:38, time: 0.896, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0565, loss_rpn_bbox: 0.0235, loss_cls: 6.2434, loss_bbox: 2.9089, loss: 9.2324
2023-01-20 16:50:10,041 - mmdet - INFO - Epoch [9][700/3696] lr: 6.000e-06, eta: 3:30:53, time: 0.902, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0567, loss_rpn_bbox: 0.0236, loss_cls: 6.2431, loss_bbox: 2.8898, loss: 9.2132
2023-01-20 16:50:55,646 - mmdet - INFO - Epoch [9][750/3696] lr: 6.000e-06, eta: 3:30:09, time: 0.912, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0569, loss_rpn_bbox: 0.0238, loss_cls: 6.2437, loss_bbox: 2.8862, loss: 9.2108
2023-01-20 16:51:41,008 - mmdet - INFO - Epoch [9][800/3696] lr: 6.000e-06, eta: 3:29:24, time: 0.907, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0239, loss_cls: 6.2495, loss_bbox: 2.8920, loss: 9.2235
2023-01-20 16:52:26,662 - mmdet - INFO - Epoch [9][850/3696] lr: 6.000e-06, eta: 3:28:39, time: 0.913, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0584, loss_rpn_bbox: 0.0239, loss_cls: 6.2401, loss_bbox: 2.8914, loss: 9.2138
2023-01-20 16:53:12,212 - mmdet - INFO - Epoch [9][900/3696] lr: 6.000e-06, eta: 3:27:55, time: 0.911, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0242, loss_cls: 6.2488, loss_bbox: 2.8797, loss: 9.2111
2023-01-20 16:53:57,441 - mmdet - INFO - Epoch [9][950/3696] lr: 6.000e-06, eta: 3:27:10, time: 0.905, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0237, loss_cls: 6.2405, loss_bbox: 2.8829, loss: 9.2045
2023-01-20 16:54:42,675 - mmdet - INFO - Epoch [9][1000/3696] lr: 6.000e-06, eta: 3:26:25, time: 0.905, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0239, loss_cls: 6.2492, loss_bbox: 2.8813, loss: 9.2120
2023-01-20 16:55:28,104 - mmdet - INFO - Epoch [9][1050/3696] lr: 6.000e-06, eta: 3:25:41, time: 0.908, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0243, loss_cls: 6.2675, loss_bbox: 2.8839, loss: 9.2332
2023-01-20 16:56:13,218 - mmdet - INFO - Epoch [9][1100/3696] lr: 6.000e-06, eta: 3:24:56, time: 0.902, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0240, loss_cls: 6.2445, loss_bbox: 2.8939, loss: 9.2199
2023-01-20 16:56:58,369 - mmdet - INFO - Epoch [9][1150/3696] lr: 6.000e-06, eta: 3:24:11, time: 0.903, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0237, loss_cls: 6.2433, loss_bbox: 2.8952, loss: 9.2195
2023-01-20 16:57:43,527 - mmdet - INFO - Epoch [9][1200/3696] lr: 6.000e-06, eta: 3:23:26, time: 0.903, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0239, loss_cls:
6.2536, loss_bbox: 2.8726, loss: 9.2078 2023-01-20 16:58:28,739 - mmdet - INFO - Epoch [9][1250/3696] lr: 6.000e-06, eta: 3:22:41, time: 0.904, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0238, loss_cls: 6.2485, loss_bbox: 2.8931, loss: 9.2232 2023-01-20 16:59:13,605 - mmdet - INFO - Epoch [9][1300/3696] lr: 6.000e-06, eta: 3:21:56, time: 0.897, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0237, loss_cls: 6.2415, loss_bbox: 2.8985, loss: 9.2208 2023-01-20 16:59:58,959 - mmdet - INFO - Epoch [9][1350/3696] lr: 6.000e-06, eta: 3:21:12, time: 0.907, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0240, loss_cls: 6.2436, loss_bbox: 2.8927, loss: 9.2185 2023-01-20 17:00:43,878 - mmdet - INFO - Epoch [9][1400/3696] lr: 6.000e-06, eta: 3:20:27, time: 0.898, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0237, loss_cls: 6.2422, loss_bbox: 2.8919, loss: 9.2154 2023-01-20 17:01:28,722 - mmdet - INFO - Epoch [9][1450/3696] lr: 6.000e-06, eta: 3:19:42, time: 0.897, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0568, loss_rpn_bbox: 0.0234, loss_cls: 6.2423, loss_bbox: 2.9103, loss: 9.2328 2023-01-20 17:02:14,334 - mmdet - INFO - Epoch [9][1500/3696] lr: 6.000e-06, eta: 3:18:57, time: 0.912, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0239, loss_cls: 6.2470, loss_bbox: 2.8935, loss: 9.2225 2023-01-20 17:02:59,816 - mmdet - INFO - Epoch [9][1550/3696] lr: 6.000e-06, eta: 3:18:12, time: 0.910, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0584, loss_rpn_bbox: 0.0241, loss_cls: 6.2585, loss_bbox: 2.8870, loss: 9.2281 2023-01-20 17:03:45,357 - mmdet - INFO - Epoch [9][1600/3696] lr: 6.000e-06, eta: 3:17:28, time: 0.911, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0584, loss_rpn_bbox: 0.0244, loss_cls: 6.2548, loss_bbox: 2.8747, loss: 9.2123 2023-01-20 17:04:31,192 - mmdet - INFO - Epoch [9][1650/3696] lr: 6.000e-06, eta: 3:16:43, time: 0.917, 
data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0242, loss_cls: 6.2591, loss_bbox: 2.8777, loss: 9.2191 2023-01-20 17:05:17,118 - mmdet - INFO - Epoch [9][1700/3696] lr: 6.000e-06, eta: 3:15:59, time: 0.919, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0243, loss_cls: 6.2713, loss_bbox: 2.8797, loss: 9.2337 2023-01-20 17:06:02,145 - mmdet - INFO - Epoch [9][1750/3696] lr: 6.000e-06, eta: 3:15:14, time: 0.901, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0239, loss_cls: 6.2491, loss_bbox: 2.8989, loss: 9.2300 2023-01-20 17:06:47,976 - mmdet - INFO - Epoch [9][1800/3696] lr: 6.000e-06, eta: 3:14:29, time: 0.917, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0241, loss_cls: 6.2537, loss_bbox: 2.8839, loss: 9.2188 2023-01-20 17:07:34,055 - mmdet - INFO - Epoch [9][1850/3696] lr: 6.000e-06, eta: 3:13:45, time: 0.922, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0244, loss_cls: 6.2628, loss_bbox: 2.8855, loss: 9.2314 2023-01-20 17:08:19,002 - mmdet - INFO - Epoch [9][1900/3696] lr: 6.000e-06, eta: 3:13:00, time: 0.899, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0239, loss_cls: 6.2511, loss_bbox: 2.8948, loss: 9.2276 2023-01-20 17:09:04,343 - mmdet - INFO - Epoch [9][1950/3696] lr: 6.000e-06, eta: 3:12:15, time: 0.907, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0242, loss_cls: 6.2587, loss_bbox: 2.8835, loss: 9.2241 2023-01-20 17:09:49,371 - mmdet - INFO - Epoch [9][2000/3696] lr: 6.000e-06, eta: 3:11:30, time: 0.901, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0237, loss_cls: 6.2513, loss_bbox: 2.8994, loss: 9.2325 2023-01-20 17:10:34,892 - mmdet - INFO - Epoch [9][2050/3696] lr: 6.000e-06, eta: 3:10:45, time: 0.910, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0241, loss_cls: 6.2539, loss_bbox: 2.8819, loss: 9.2178 2023-01-20 
17:11:19,447 - mmdet - INFO - Epoch [9][2100/3696] lr: 6.000e-06, eta: 3:10:00, time: 0.891, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0235, loss_cls: 6.2442, loss_bbox: 2.9039, loss: 9.2286 2023-01-20 17:12:05,091 - mmdet - INFO - Epoch [9][2150/3696] lr: 6.000e-06, eta: 3:09:16, time: 0.913, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0241, loss_cls: 6.2498, loss_bbox: 2.8797, loss: 9.2116 2023-01-20 17:12:51,239 - mmdet - INFO - Epoch [9][2200/3696] lr: 6.000e-06, eta: 3:08:31, time: 0.923, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0244, loss_cls: 6.2574, loss_bbox: 2.8509, loss: 9.1909 2023-01-20 17:13:36,846 - mmdet - INFO - Epoch [9][2250/3696] lr: 6.000e-06, eta: 3:07:46, time: 0.912, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0241, loss_cls: 6.2535, loss_bbox: 2.8695, loss: 9.2051 2023-01-20 17:14:22,173 - mmdet - INFO - Epoch [9][2300/3696] lr: 6.000e-06, eta: 3:07:02, time: 0.907, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0568, loss_rpn_bbox: 0.0237, loss_cls: 6.2451, loss_bbox: 2.8878, loss: 9.2133 2023-01-20 17:15:07,733 - mmdet - INFO - Epoch [9][2350/3696] lr: 6.000e-06, eta: 3:06:17, time: 0.911, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0240, loss_cls: 6.2515, loss_bbox: 2.8831, loss: 9.2172 2023-01-20 17:15:53,321 - mmdet - INFO - Epoch [9][2400/3696] lr: 6.000e-06, eta: 3:05:32, time: 0.912, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0569, loss_rpn_bbox: 0.0237, loss_cls: 6.2537, loss_bbox: 2.8732, loss: 9.2075 2023-01-20 17:16:39,248 - mmdet - INFO - Epoch [9][2450/3696] lr: 6.000e-06, eta: 3:04:48, time: 0.919, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0241, loss_cls: 6.2516, loss_bbox: 2.8740, loss: 9.2076 2023-01-20 17:17:24,144 - mmdet - INFO - Epoch [9][2500/3696] lr: 6.000e-06, eta: 3:04:03, time: 0.898, data_time: 0.018, memory: 13290, loss_rpn_cls: 
0.0570, loss_rpn_bbox: 0.0235, loss_cls: 6.2488, loss_bbox: 2.9041, loss: 9.2334 2023-01-20 17:18:09,741 - mmdet - INFO - Epoch [9][2550/3696] lr: 6.000e-06, eta: 3:03:18, time: 0.912, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0246, loss_cls: 6.2624, loss_bbox: 2.8721, loss: 9.2181 2023-01-20 17:18:55,277 - mmdet - INFO - Epoch [9][2600/3696] lr: 6.000e-06, eta: 3:02:33, time: 0.911, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0242, loss_cls: 6.2544, loss_bbox: 2.8721, loss: 9.2089 2023-01-20 17:19:40,817 - mmdet - INFO - Epoch [9][2650/3696] lr: 6.000e-06, eta: 3:01:48, time: 0.911, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0241, loss_cls: 6.2518, loss_bbox: 2.8852, loss: 9.2187 2023-01-20 17:20:26,850 - mmdet - INFO - Epoch [9][2700/3696] lr: 6.000e-06, eta: 3:01:04, time: 0.921, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0240, loss_cls: 6.2552, loss_bbox: 2.8828, loss: 9.2194 2023-01-20 17:21:12,673 - mmdet - INFO - Epoch [9][2750/3696] lr: 6.000e-06, eta: 3:00:19, time: 0.916, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0240, loss_cls: 6.2507, loss_bbox: 2.8872, loss: 9.2201 2023-01-20 17:21:57,628 - mmdet - INFO - Epoch [9][2800/3696] lr: 6.000e-06, eta: 2:59:34, time: 0.899, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0236, loss_cls: 6.2460, loss_bbox: 2.8895, loss: 9.2162 2023-01-20 17:22:42,997 - mmdet - INFO - Epoch [9][2850/3696] lr: 6.000e-06, eta: 2:58:50, time: 0.907, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0239, loss_cls: 6.2482, loss_bbox: 2.8846, loss: 9.2145 2023-01-20 17:23:28,069 - mmdet - INFO - Epoch [9][2900/3696] lr: 6.000e-06, eta: 2:58:05, time: 0.901, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0238, loss_cls: 6.2544, loss_bbox: 2.8899, loss: 9.2256 2023-01-20 17:24:13,591 - mmdet - INFO - Epoch [9][2950/3696] 
lr: 6.000e-06, eta: 2:57:20, time: 0.910, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0565, loss_rpn_bbox: 0.0235, loss_cls: 6.2490, loss_bbox: 2.9046, loss: 9.2335 2023-01-20 17:24:59,378 - mmdet - INFO - Epoch [9][3000/3696] lr: 6.000e-06, eta: 2:56:35, time: 0.916, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0236, loss_cls: 6.2446, loss_bbox: 2.8812, loss: 9.2067 2023-01-20 17:25:45,061 - mmdet - INFO - Epoch [9][3050/3696] lr: 6.000e-06, eta: 2:55:51, time: 0.914, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0239, loss_cls: 6.2493, loss_bbox: 2.8693, loss: 9.2003 2023-01-20 17:26:31,024 - mmdet - INFO - Epoch [9][3100/3696] lr: 6.000e-06, eta: 2:55:06, time: 0.919, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0241, loss_cls: 6.2537, loss_bbox: 2.8759, loss: 9.2114 2023-01-20 17:27:16,393 - mmdet - INFO - Epoch [9][3150/3696] lr: 6.000e-06, eta: 2:54:21, time: 0.907, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0240, loss_cls: 6.2532, loss_bbox: 2.8806, loss: 9.2152 2023-01-20 17:28:02,082 - mmdet - INFO - Epoch [9][3200/3696] lr: 6.000e-06, eta: 2:53:36, time: 0.914, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0242, loss_cls: 6.2545, loss_bbox: 2.8735, loss: 9.2103 2023-01-20 17:28:47,943 - mmdet - INFO - Epoch [9][3250/3696] lr: 6.000e-06, eta: 2:52:52, time: 0.917, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0246, loss_cls: 6.2569, loss_bbox: 2.8701, loss: 9.2104 2023-01-20 17:29:33,078 - mmdet - INFO - Epoch [9][3300/3696] lr: 6.000e-06, eta: 2:52:07, time: 0.903, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0567, loss_rpn_bbox: 0.0238, loss_cls: 6.2500, loss_bbox: 2.8840, loss: 9.2144 2023-01-20 17:30:18,550 - mmdet - INFO - Epoch [9][3350/3696] lr: 6.000e-06, eta: 2:51:22, time: 0.909, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0240, loss_cls: 6.2463, 
loss_bbox: 2.8755, loss: 9.2039 2023-01-20 17:31:04,029 - mmdet - INFO - Epoch [9][3400/3696] lr: 6.000e-06, eta: 2:50:37, time: 0.910, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0243, loss_cls: 6.2535, loss_bbox: 2.8785, loss: 9.2153 2023-01-20 17:31:49,320 - mmdet - INFO - Epoch [9][3450/3696] lr: 6.000e-06, eta: 2:49:52, time: 0.906, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0240, loss_cls: 6.2544, loss_bbox: 2.8843, loss: 9.2210 2023-01-20 17:32:34,890 - mmdet - INFO - Epoch [9][3500/3696] lr: 6.000e-06, eta: 2:49:08, time: 0.911, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0238, loss_cls: 6.2423, loss_bbox: 2.8846, loss: 9.2082 2023-01-20 17:33:19,751 - mmdet - INFO - Epoch [9][3550/3696] lr: 6.000e-06, eta: 2:48:23, time: 0.897, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0568, loss_rpn_bbox: 0.0234, loss_cls: 6.2321, loss_bbox: 2.9048, loss: 9.2173 2023-01-20 17:34:05,928 - mmdet - INFO - Epoch [9][3600/3696] lr: 6.000e-06, eta: 2:47:38, time: 0.924, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0244, loss_cls: 6.2601, loss_bbox: 2.8727, loss: 9.2154 2023-01-20 17:34:51,449 - mmdet - INFO - Epoch [9][3650/3696] lr: 6.000e-06, eta: 2:46:53, time: 0.910, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0240, loss_cls: 6.2501, loss_bbox: 2.8737, loss: 9.2058 2023-01-20 17:35:34,268 - mmdet - INFO - Saving checkpoint at 9 epochs 2023-01-20 17:36:25,942 - mmdet - INFO - Epoch [10][50/3696] lr: 6.000e-06, eta: 2:45:14, time: 0.969, data_time: 0.074, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0243, loss_cls: 6.2585, loss_bbox: 2.8701, loss: 9.2114 2023-01-20 17:37:11,151 - mmdet - INFO - Epoch [10][100/3696] lr: 6.000e-06, eta: 2:44:30, time: 0.904, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0242, loss_cls: 6.2550, loss_bbox: 2.8638, loss: 9.2016 2023-01-20 17:37:57,132 - mmdet - INFO - 
Epoch [10][150/3696] lr: 6.000e-06, eta: 2:43:45, time: 0.920, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0238, loss_cls: 6.2441, loss_bbox: 2.8865, loss: 9.2119 2023-01-20 17:38:42,122 - mmdet - INFO - Epoch [10][200/3696] lr: 6.000e-06, eta: 2:43:00, time: 0.900, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0237, loss_cls: 6.2435, loss_bbox: 2.8958, loss: 9.2201 2023-01-20 17:39:27,596 - mmdet - INFO - Epoch [10][250/3696] lr: 6.000e-06, eta: 2:42:15, time: 0.909, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0240, loss_cls: 6.2457, loss_bbox: 2.8809, loss: 9.2082 2023-01-20 17:40:13,283 - mmdet - INFO - Epoch [10][300/3696] lr: 6.000e-06, eta: 2:41:31, time: 0.914, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0239, loss_cls: 6.2477, loss_bbox: 2.8746, loss: 9.2037 2023-01-20 17:40:59,680 - mmdet - INFO - Epoch [10][350/3696] lr: 6.000e-06, eta: 2:40:46, time: 0.928, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0600, loss_rpn_bbox: 0.0246, loss_cls: 6.2596, loss_bbox: 2.8616, loss: 9.2058 2023-01-20 17:41:44,696 - mmdet - INFO - Epoch [10][400/3696] lr: 6.000e-06, eta: 2:40:01, time: 0.900, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0568, loss_rpn_bbox: 0.0237, loss_cls: 6.2437, loss_bbox: 2.8939, loss: 9.2181 2023-01-20 17:42:29,644 - mmdet - INFO - Epoch [10][450/3696] lr: 6.000e-06, eta: 2:39:16, time: 0.899, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0569, loss_rpn_bbox: 0.0237, loss_cls: 6.2352, loss_bbox: 2.8963, loss: 9.2120 2023-01-20 17:43:15,235 - mmdet - INFO - Epoch [10][500/3696] lr: 6.000e-06, eta: 2:38:32, time: 0.912, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0237, loss_cls: 6.2421, loss_bbox: 2.8733, loss: 9.1967 2023-01-20 17:44:01,116 - mmdet - INFO - Epoch [10][550/3696] lr: 6.000e-06, eta: 2:37:47, time: 0.918, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0591, loss_rpn_bbox: 0.0247, 
loss_cls: 6.2733, loss_bbox: 2.8768, loss: 9.2340 2023-01-20 17:44:46,624 - mmdet - INFO - Epoch [10][600/3696] lr: 6.000e-06, eta: 2:37:02, time: 0.910, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0240, loss_cls: 6.2451, loss_bbox: 2.8779, loss: 9.2050 2023-01-20 17:45:32,624 - mmdet - INFO - Epoch [10][650/3696] lr: 6.000e-06, eta: 2:36:18, time: 0.920, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0243, loss_cls: 6.2586, loss_bbox: 2.8712, loss: 9.2128 2023-01-20 17:46:18,239 - mmdet - INFO - Epoch [10][700/3696] lr: 6.000e-06, eta: 2:35:33, time: 0.912, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0239, loss_cls: 6.2439, loss_bbox: 2.8751, loss: 9.2008 2023-01-20 17:47:03,957 - mmdet - INFO - Epoch [10][750/3696] lr: 6.000e-06, eta: 2:34:48, time: 0.914, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0241, loss_cls: 6.2514, loss_bbox: 2.8795, loss: 9.2133 2023-01-20 17:47:50,124 - mmdet - INFO - Epoch [10][800/3696] lr: 6.000e-06, eta: 2:34:04, time: 0.923, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0241, loss_cls: 6.2526, loss_bbox: 2.8698, loss: 9.2036 2023-01-20 17:48:34,654 - mmdet - INFO - Epoch [10][850/3696] lr: 6.000e-06, eta: 2:33:19, time: 0.891, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0236, loss_cls: 6.2463, loss_bbox: 2.9135, loss: 9.2411 2023-01-20 17:49:19,730 - mmdet - INFO - Epoch [10][900/3696] lr: 6.000e-06, eta: 2:32:34, time: 0.902, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0237, loss_cls: 6.2443, loss_bbox: 2.8862, loss: 9.2111 2023-01-20 17:50:05,116 - mmdet - INFO - Epoch [10][950/3696] lr: 6.000e-06, eta: 2:31:49, time: 0.908, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0238, loss_cls: 6.2452, loss_bbox: 2.8790, loss: 9.2056 2023-01-20 17:50:50,074 - mmdet - INFO - Epoch [10][1000/3696] lr: 6.000e-06, eta: 2:31:04, 
time: 0.899, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0242, loss_cls: 6.2608, loss_bbox: 2.8762, loss: 9.2189 2023-01-20 17:51:35,376 - mmdet - INFO - Epoch [10][1050/3696] lr: 6.000e-06, eta: 2:30:19, time: 0.906, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0240, loss_cls: 6.2538, loss_bbox: 2.8782, loss: 9.2141 2023-01-20 17:52:20,331 - mmdet - INFO - Epoch [10][1100/3696] lr: 6.000e-06, eta: 2:29:35, time: 0.899, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0559, loss_rpn_bbox: 0.0231, loss_cls: 6.2385, loss_bbox: 2.9070, loss: 9.2245 2023-01-20 17:53:05,739 - mmdet - INFO - Epoch [10][1150/3696] lr: 6.000e-06, eta: 2:28:50, time: 0.908, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0568, loss_rpn_bbox: 0.0237, loss_cls: 6.2464, loss_bbox: 2.8811, loss: 9.2079 2023-01-20 17:53:51,261 - mmdet - INFO - Epoch [10][1200/3696] lr: 6.000e-06, eta: 2:28:05, time: 0.910, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0235, loss_cls: 6.2359, loss_bbox: 2.8720, loss: 9.1886 2023-01-20 17:54:36,402 - mmdet - INFO - Epoch [10][1250/3696] lr: 6.000e-06, eta: 2:27:20, time: 0.903, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0565, loss_rpn_bbox: 0.0236, loss_cls: 6.2407, loss_bbox: 2.8916, loss: 9.2124 2023-01-20 17:55:21,841 - mmdet - INFO - Epoch [10][1300/3696] lr: 6.000e-06, eta: 2:26:35, time: 0.909, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0240, loss_cls: 6.2541, loss_bbox: 2.8827, loss: 9.2187 2023-01-20 17:56:06,600 - mmdet - INFO - Epoch [10][1350/3696] lr: 6.000e-06, eta: 2:25:50, time: 0.895, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0240, loss_cls: 6.2475, loss_bbox: 2.8789, loss: 9.2083 2023-01-20 17:56:51,608 - mmdet - INFO - Epoch [10][1400/3696] lr: 6.000e-06, eta: 2:25:05, time: 0.900, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0240, loss_cls: 6.2569, loss_bbox: 2.8907, loss: 
9.2288 2023-01-20 17:57:37,048 - mmdet - INFO - Epoch [10][1450/3696] lr: 6.000e-06, eta: 2:24:21, time: 0.909, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0244, loss_cls: 6.2619, loss_bbox: 2.8811, loss: 9.2262 2023-01-20 17:58:22,768 - mmdet - INFO - Epoch [10][1500/3696] lr: 6.000e-06, eta: 2:23:36, time: 0.914, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0240, loss_cls: 6.2545, loss_bbox: 2.8796, loss: 9.2163 2023-01-20 17:59:08,304 - mmdet - INFO - Epoch [10][1550/3696] lr: 6.000e-06, eta: 2:22:51, time: 0.911, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0239, loss_cls: 6.2400, loss_bbox: 2.8860, loss: 9.2085 2023-01-20 17:59:53,561 - mmdet - INFO - Epoch [10][1600/3696] lr: 6.000e-06, eta: 2:22:06, time: 0.905, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0236, loss_cls: 6.2377, loss_bbox: 2.8911, loss: 9.2103 2023-01-20 18:00:38,752 - mmdet - INFO - Epoch [10][1650/3696] lr: 6.000e-06, eta: 2:21:21, time: 0.904, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0238, loss_cls: 6.2488, loss_bbox: 2.8852, loss: 9.2152 2023-01-20 18:01:24,246 - mmdet - INFO - Epoch [10][1700/3696] lr: 6.000e-06, eta: 2:20:37, time: 0.910, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0243, loss_cls: 6.2617, loss_bbox: 2.8778, loss: 9.2222 2023-01-20 18:02:10,020 - mmdet - INFO - Epoch [10][1750/3696] lr: 6.000e-06, eta: 2:19:52, time: 0.915, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0243, loss_cls: 6.2570, loss_bbox: 2.8670, loss: 9.2066 2023-01-20 18:02:55,582 - mmdet - INFO - Epoch [10][1800/3696] lr: 6.000e-06, eta: 2:19:07, time: 0.911, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0569, loss_rpn_bbox: 0.0235, loss_cls: 6.2364, loss_bbox: 2.8895, loss: 9.2063 2023-01-20 18:03:41,366 - mmdet - INFO - Epoch [10][1850/3696] lr: 6.000e-06, eta: 2:18:22, time: 0.916, data_time: 0.017, 
memory: 13290, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0243, loss_cls: 6.2594, loss_bbox: 2.8719, loss: 9.2145 2023-01-20 18:04:26,829 - mmdet - INFO - Epoch [10][1900/3696] lr: 6.000e-06, eta: 2:17:38, time: 0.909, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0241, loss_cls: 6.2536, loss_bbox: 2.8872, loss: 9.2236 2023-01-20 18:05:11,882 - mmdet - INFO - Epoch [10][1950/3696] lr: 6.000e-06, eta: 2:16:53, time: 0.901, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0237, loss_cls: 6.2371, loss_bbox: 2.8829, loss: 9.2007 2023-01-20 18:05:57,035 - mmdet - INFO - Epoch [10][2000/3696] lr: 6.000e-06, eta: 2:16:08, time: 0.903, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0242, loss_cls: 6.2587, loss_bbox: 2.8824, loss: 9.2236 2023-01-20 18:06:42,788 - mmdet - INFO - Epoch [10][2050/3696] lr: 6.000e-06, eta: 2:15:23, time: 0.915, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0241, loss_cls: 6.2519, loss_bbox: 2.8774, loss: 9.2117 2023-01-20 18:07:28,198 - mmdet - INFO - Epoch [10][2100/3696] lr: 6.000e-06, eta: 2:14:38, time: 0.908, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0565, loss_rpn_bbox: 0.0235, loss_cls: 6.2392, loss_bbox: 2.8911, loss: 9.2105 2023-01-20 18:08:13,429 - mmdet - INFO - Epoch [10][2150/3696] lr: 6.000e-06, eta: 2:13:53, time: 0.905, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0237, loss_cls: 6.2409, loss_bbox: 2.8969, loss: 9.2189 2023-01-20 18:08:58,919 - mmdet - INFO - Epoch [10][2200/3696] lr: 6.000e-06, eta: 2:13:09, time: 0.910, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0240, loss_cls: 6.2603, loss_bbox: 2.8870, loss: 9.2295 2023-01-20 18:09:44,603 - mmdet - INFO - Epoch [10][2250/3696] lr: 6.000e-06, eta: 2:12:24, time: 0.914, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0240, loss_cls: 6.2474, loss_bbox: 2.8749, loss: 9.2048 2023-01-20 18:10:29,946 - 
mmdet - INFO - Epoch [10][2300/3696] lr: 6.000e-06, eta: 2:11:39, time: 0.907, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0240, loss_cls: 6.2452, loss_bbox: 2.8747, loss: 9.2011 2023-01-20 18:11:15,391 - mmdet - INFO - Epoch [10][2350/3696] lr: 6.000e-06, eta: 2:10:54, time: 0.909, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0244, loss_cls: 6.2576, loss_bbox: 2.8741, loss: 9.2140 2023-01-20 18:12:00,683 - mmdet - INFO - Epoch [10][2400/3696] lr: 6.000e-06, eta: 2:10:09, time: 0.906, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0238, loss_cls: 6.2520, loss_bbox: 2.9011, loss: 9.2339 2023-01-20 18:12:46,540 - mmdet - INFO - Epoch [10][2450/3696] lr: 6.000e-06, eta: 2:09:25, time: 0.917, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0240, loss_cls: 6.2526, loss_bbox: 2.8713, loss: 9.2053 2023-01-20 18:13:32,675 - mmdet - INFO - Epoch [10][2500/3696] lr: 6.000e-06, eta: 2:08:40, time: 0.923, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0241, loss_cls: 6.2528, loss_bbox: 2.8698, loss: 9.2045 2023-01-20 18:14:18,473 - mmdet - INFO - Epoch [10][2550/3696] lr: 6.000e-06, eta: 2:07:55, time: 0.916, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0243, loss_cls: 6.2571, loss_bbox: 2.8739, loss: 9.2140 2023-01-20 18:15:03,800 - mmdet - INFO - Epoch [10][2600/3696] lr: 6.000e-06, eta: 2:07:10, time: 0.907, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0567, loss_rpn_bbox: 0.0235, loss_cls: 6.2447, loss_bbox: 2.8929, loss: 9.2178 2023-01-20 18:15:48,957 - mmdet - INFO - Epoch [10][2650/3696] lr: 6.000e-06, eta: 2:06:25, time: 0.903, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0566, loss_rpn_bbox: 0.0235, loss_cls: 6.2426, loss_bbox: 2.8813, loss: 9.2040 2023-01-20 18:16:34,194 - mmdet - INFO - Epoch [10][2700/3696] lr: 6.000e-06, eta: 2:05:41, time: 0.905, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0563, 
loss_rpn_bbox: 0.0233, loss_cls: 6.2392, loss_bbox: 2.8862, loss: 9.2050 2023-01-20 18:17:19,196 - mmdet - INFO - Epoch [10][2750/3696] lr: 6.000e-06, eta: 2:04:56, time: 0.900, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0240, loss_cls: 6.2575, loss_bbox: 2.8883, loss: 9.2270 2023-01-20 18:18:04,319 - mmdet - INFO - Epoch [10][2800/3696] lr: 6.000e-06, eta: 2:04:11, time: 0.902, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0237, loss_cls: 6.2468, loss_bbox: 2.8859, loss: 9.2137 2023-01-20 18:18:50,812 - mmdet - INFO - Epoch [10][2850/3696] lr: 6.000e-06, eta: 2:03:26, time: 0.930, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0593, loss_rpn_bbox: 0.0249, loss_cls: 6.2714, loss_bbox: 2.8554, loss: 9.2111 2023-01-20 18:19:36,537 - mmdet - INFO - Epoch [10][2900/3696] lr: 6.000e-06, eta: 2:02:41, time: 0.914, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0240, loss_cls: 6.2421, loss_bbox: 2.8863, loss: 9.2104 2023-01-20 18:20:22,523 - mmdet - INFO - Epoch [10][2950/3696] lr: 6.000e-06, eta: 2:01:57, time: 0.920, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0241, loss_cls: 6.2532, loss_bbox: 2.8833, loss: 9.2188 2023-01-20 18:21:07,933 - mmdet - INFO - Epoch [10][3000/3696] lr: 6.000e-06, eta: 2:01:12, time: 0.908, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0244, loss_cls: 6.2621, loss_bbox: 2.8684, loss: 9.2137 2023-01-20 18:21:54,143 - mmdet - INFO - Epoch [10][3050/3696] lr: 6.000e-06, eta: 2:00:27, time: 0.924, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0243, loss_cls: 6.2658, loss_bbox: 2.8647, loss: 9.2120 2023-01-20 18:22:39,939 - mmdet - INFO - Epoch [10][3100/3696] lr: 6.000e-06, eta: 1:59:42, time: 0.916, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0245, loss_cls: 6.2553, loss_bbox: 2.8758, loss: 9.2143 2023-01-20 18:23:26,351 - mmdet - INFO - Epoch [10][3150/3696] 
lr: 6.000e-06, eta: 1:58:58, time: 0.928, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0240, loss_cls: 6.2480, loss_bbox: 2.8771, loss: 9.2074
2023-01-20 18:24:11,342 - mmdet - INFO - Epoch [10][3200/3696] lr: 6.000e-06, eta: 1:58:13, time: 0.900, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0240, loss_cls: 6.2466, loss_bbox: 2.8873, loss: 9.2154
2023-01-20 18:24:56,954 - mmdet - INFO - Epoch [10][3250/3696] lr: 6.000e-06, eta: 1:57:28, time: 0.912, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0239, loss_cls: 6.2410, loss_bbox: 2.8760, loss: 9.1986
2023-01-20 18:25:42,198 - mmdet - INFO - Epoch [10][3300/3696] lr: 6.000e-06, eta: 1:56:43, time: 0.905, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0238, loss_cls: 6.2444, loss_bbox: 2.8789, loss: 9.2044
2023-01-20 18:26:27,573 - mmdet - INFO - Epoch [10][3350/3696] lr: 6.000e-06, eta: 1:55:58, time: 0.907, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0563, loss_rpn_bbox: 0.0235, loss_cls: 6.2452, loss_bbox: 2.8843, loss: 9.2094
2023-01-20 18:27:12,815 - mmdet - INFO - Epoch [10][3400/3696] lr: 6.000e-06, eta: 1:55:13, time: 0.905, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0239, loss_cls: 6.2484, loss_bbox: 2.9012, loss: 9.2312
2023-01-20 18:27:57,951 - mmdet - INFO - Epoch [10][3450/3696] lr: 6.000e-06, eta: 1:54:28, time: 0.903, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0239, loss_cls: 6.2528, loss_bbox: 2.8676, loss: 9.2013
2023-01-20 18:28:45,254 - mmdet - INFO - Epoch [10][3500/3696] lr: 6.000e-06, eta: 1:53:44, time: 0.946, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0242, loss_cls: 6.2597, loss_bbox: 2.8861, loss: 9.2273
2023-01-20 18:29:30,594 - mmdet - INFO - Epoch [10][3550/3696] lr: 6.000e-06, eta: 1:52:59, time: 0.907, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0569, loss_rpn_bbox: 0.0236, loss_cls: 6.2398, loss_bbox: 2.8939, loss: 9.2142
2023-01-20 18:30:15,946 - mmdet - INFO - Epoch [10][3600/3696] lr: 6.000e-06, eta: 1:52:14, time: 0.907, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0235, loss_cls: 6.2385, loss_bbox: 2.8899, loss: 9.2088
2023-01-20 18:31:02,597 - mmdet - INFO - Epoch [10][3650/3696] lr: 6.000e-06, eta: 1:51:29, time: 0.933, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0242, loss_cls: 6.2519, loss_bbox: 2.8649, loss: 9.1985
2023-01-20 18:31:44,703 - mmdet - INFO - Saving checkpoint at 10 epochs
2023-01-20 18:32:43,441 - mmdet - INFO - Epoch [11][50/3696] lr: 6.000e-06, eta: 1:49:57, time: 1.112, data_time: 0.076, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0244, loss_cls: 6.2554, loss_bbox: 2.8571, loss: 9.1957
2023-01-20 18:33:30,709 - mmdet - INFO - Epoch [11][100/3696] lr: 6.000e-06, eta: 1:49:12, time: 0.945, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0237, loss_cls: 6.2439, loss_bbox: 2.8890, loss: 9.2135
2023-01-20 18:34:20,935 - mmdet - INFO - Epoch [11][150/3696] lr: 6.000e-06, eta: 1:48:29, time: 1.005, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0243, loss_cls: 6.2585, loss_bbox: 2.8781, loss: 9.2190
2023-01-20 18:35:08,753 - mmdet - INFO - Epoch [11][200/3696] lr: 6.000e-06, eta: 1:47:44, time: 0.956, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0239, loss_cls: 6.2399, loss_bbox: 2.8972, loss: 9.2184
2023-01-20 18:35:54,696 - mmdet - INFO - Epoch [11][250/3696] lr: 6.000e-06, eta: 1:46:59, time: 0.919, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0584, loss_rpn_bbox: 0.0241, loss_cls: 6.2454, loss_bbox: 2.8679, loss: 9.1957
2023-01-20 18:36:40,325 - mmdet - INFO - Epoch [11][300/3696] lr: 6.000e-06, eta: 1:46:15, time: 0.913, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0241, loss_cls: 6.2559, loss_bbox: 2.8919, loss: 9.2299
2023-01-20 18:37:25,746 - mmdet - INFO - Epoch [11][350/3696] lr: 6.000e-06, eta: 1:45:30, time: 0.908, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0237, loss_cls: 6.2468, loss_bbox: 2.8938, loss: 9.2216
2023-01-20 18:38:11,244 - mmdet - INFO - Epoch [11][400/3696] lr: 6.000e-06, eta: 1:44:45, time: 0.910, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0241, loss_cls: 6.2501, loss_bbox: 2.8882, loss: 9.2204
2023-01-20 18:38:57,933 - mmdet - INFO - Epoch [11][450/3696] lr: 6.000e-06, eta: 1:44:00, time: 0.934, data_time: 0.021, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0240, loss_cls: 6.2434, loss_bbox: 2.8577, loss: 9.1829
2023-01-20 18:39:43,302 - mmdet - INFO - Epoch [11][500/3696] lr: 6.000e-06, eta: 1:43:15, time: 0.907, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0240, loss_cls: 6.2468, loss_bbox: 2.8814, loss: 9.2104
2023-01-20 18:40:28,837 - mmdet - INFO - Epoch [11][550/3696] lr: 6.000e-06, eta: 1:42:31, time: 0.911, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0240, loss_cls: 6.2471, loss_bbox: 2.8800, loss: 9.2089
2023-01-20 18:41:14,329 - mmdet - INFO - Epoch [11][600/3696] lr: 6.000e-06, eta: 1:41:46, time: 0.910, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0239, loss_cls: 6.2458, loss_bbox: 2.8889, loss: 9.2161
2023-01-20 18:42:00,307 - mmdet - INFO - Epoch [11][650/3696] lr: 6.000e-06, eta: 1:41:01, time: 0.920, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0241, loss_cls: 6.2541, loss_bbox: 2.8801, loss: 9.2164
2023-01-20 18:42:45,636 - mmdet - INFO - Epoch [11][700/3696] lr: 6.000e-06, eta: 1:40:16, time: 0.907, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0240, loss_cls: 6.2441, loss_bbox: 2.8809, loss: 9.2064
2023-01-20 18:43:30,738 - mmdet - INFO - Epoch [11][750/3696] lr: 6.000e-06, eta: 1:39:31, time: 0.902, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0237, loss_cls: 6.2414, loss_bbox: 2.8974, loss: 9.2199
2023-01-20 18:44:16,049 - mmdet - INFO - Epoch [11][800/3696] lr: 6.000e-06, eta: 1:38:46, time: 0.906, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0239, loss_cls: 6.2443, loss_bbox: 2.8952, loss: 9.2208
2023-01-20 18:45:01,757 - mmdet - INFO - Epoch [11][850/3696] lr: 6.000e-06, eta: 1:38:01, time: 0.914, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0238, loss_cls: 6.2446, loss_bbox: 2.8774, loss: 9.2034
2023-01-20 18:45:47,455 - mmdet - INFO - Epoch [11][900/3696] lr: 6.000e-06, eta: 1:37:17, time: 0.914, data_time: 0.021, memory: 13290, loss_rpn_cls: 0.0569, loss_rpn_bbox: 0.0236, loss_cls: 6.2425, loss_bbox: 2.8797, loss: 9.2026
2023-01-20 18:46:32,940 - mmdet - INFO - Epoch [11][950/3696] lr: 6.000e-06, eta: 1:36:32, time: 0.910, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0238, loss_cls: 6.2500, loss_bbox: 2.8740, loss: 9.2049
2023-01-20 18:47:18,767 - mmdet - INFO - Epoch [11][1000/3696] lr: 6.000e-06, eta: 1:35:47, time: 0.916, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0239, loss_cls: 6.2511, loss_bbox: 2.8725, loss: 9.2052
2023-01-20 18:48:03,945 - mmdet - INFO - Epoch [11][1050/3696] lr: 6.000e-06, eta: 1:35:02, time: 0.904, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0235, loss_cls: 6.2378, loss_bbox: 2.8841, loss: 9.2029
2023-01-20 18:48:49,169 - mmdet - INFO - Epoch [11][1100/3696] lr: 6.000e-06, eta: 1:34:17, time: 0.904, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0238, loss_cls: 6.2489, loss_bbox: 2.8888, loss: 9.2186
2023-01-20 18:49:34,846 - mmdet - INFO - Epoch [11][1150/3696] lr: 6.000e-06, eta: 1:33:32, time: 0.914, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0242, loss_cls: 6.2497, loss_bbox: 2.8851, loss: 9.2170
2023-01-20 18:50:20,809 - mmdet - INFO - Epoch [11][1200/3696] lr: 6.000e-06, eta: 1:32:47, time: 0.919, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0567, loss_rpn_bbox: 0.0236, loss_cls: 6.2456, loss_bbox: 2.8736, loss: 9.1995
2023-01-20 18:51:06,791 - mmdet - INFO - Epoch [11][1250/3696] lr: 6.000e-06, eta: 1:32:03, time: 0.920, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0584, loss_rpn_bbox: 0.0242, loss_cls: 6.2577, loss_bbox: 2.8721, loss: 9.2124
2023-01-20 18:51:52,348 - mmdet - INFO - Epoch [11][1300/3696] lr: 6.000e-06, eta: 1:31:18, time: 0.911, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0241, loss_cls: 6.2436, loss_bbox: 2.8751, loss: 9.2009
2023-01-20 18:52:38,662 - mmdet - INFO - Epoch [11][1350/3696] lr: 6.000e-06, eta: 1:30:33, time: 0.926, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0246, loss_cls: 6.2623, loss_bbox: 2.8619, loss: 9.2074
2023-01-20 18:53:24,423 - mmdet - INFO - Epoch [11][1400/3696] lr: 6.000e-06, eta: 1:29:48, time: 0.915, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0243, loss_cls: 6.2541, loss_bbox: 2.8684, loss: 9.2050
2023-01-20 18:54:09,825 - mmdet - INFO - Epoch [11][1450/3696] lr: 6.000e-06, eta: 1:29:03, time: 0.908, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0568, loss_rpn_bbox: 0.0236, loss_cls: 6.2391, loss_bbox: 2.8892, loss: 9.2086
2023-01-20 18:54:55,575 - mmdet - INFO - Epoch [11][1500/3696] lr: 6.000e-06, eta: 1:28:18, time: 0.915, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0240, loss_cls: 6.2483, loss_bbox: 2.8846, loss: 9.2145
2023-01-20 18:55:40,651 - mmdet - INFO - Epoch [11][1550/3696] lr: 6.000e-06, eta: 1:27:33, time: 0.902, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0560, loss_rpn_bbox: 0.0234, loss_cls: 6.2342, loss_bbox: 2.8966, loss: 9.2103
2023-01-20 18:56:26,343 - mmdet - INFO - Epoch [11][1600/3696] lr: 6.000e-06, eta: 1:26:49, time: 0.914, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0242, loss_cls: 6.2503, loss_bbox: 2.8765, loss: 9.2097
2023-01-20 18:57:11,778 - mmdet - INFO - Epoch [11][1650/3696] lr: 6.000e-06, eta: 1:26:04, time: 0.909, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0237, loss_cls: 6.2434, loss_bbox: 2.8824, loss: 9.2066
2023-01-20 18:57:57,172 - mmdet - INFO - Epoch [11][1700/3696] lr: 6.000e-06, eta: 1:25:19, time: 0.908, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0240, loss_cls: 6.2488, loss_bbox: 2.8841, loss: 9.2150
2023-01-20 18:58:42,859 - mmdet - INFO - Epoch [11][1750/3696] lr: 6.000e-06, eta: 1:24:34, time: 0.914, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0239, loss_cls: 6.2529, loss_bbox: 2.8815, loss: 9.2164
2023-01-20 18:59:28,341 - mmdet - INFO - Epoch [11][1800/3696] lr: 6.000e-06, eta: 1:23:49, time: 0.910, data_time: 0.021, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0238, loss_cls: 6.2430, loss_bbox: 2.8785, loss: 9.2033
2023-01-20 19:00:13,532 - mmdet - INFO - Epoch [11][1850/3696] lr: 6.000e-06, eta: 1:23:04, time: 0.904, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0238, loss_cls: 6.2478, loss_bbox: 2.8853, loss: 9.2142
2023-01-20 19:00:59,219 - mmdet - INFO - Epoch [11][1900/3696] lr: 6.000e-06, eta: 1:22:19, time: 0.914, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0241, loss_cls: 6.2492, loss_bbox: 2.8842, loss: 9.2154
2023-01-20 19:01:44,000 - mmdet - INFO - Epoch [11][1950/3696] lr: 6.000e-06, eta: 1:21:34, time: 0.896, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0566, loss_rpn_bbox: 0.0238, loss_cls: 6.2538, loss_bbox: 2.8948, loss: 9.2292
2023-01-20 19:02:29,473 - mmdet - INFO - Epoch [11][2000/3696] lr: 6.000e-06, eta: 1:20:49, time: 0.909, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0241, loss_cls: 6.2498, loss_bbox: 2.8776, loss: 9.2096
2023-01-20 19:03:14,728 - mmdet - INFO - Epoch [11][2050/3696] lr: 6.000e-06, eta: 1:20:04, time: 0.905, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0240, loss_cls: 6.2601, loss_bbox: 2.8796, loss: 9.2212
2023-01-20 19:04:00,318 - mmdet - INFO - Epoch [11][2100/3696] lr: 6.000e-06, eta: 1:19:20, time: 0.912, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0242, loss_cls: 6.2571, loss_bbox: 2.8685, loss: 9.2078
2023-01-20 19:04:45,919 - mmdet - INFO - Epoch [11][2150/3696] lr: 6.000e-06, eta: 1:18:35, time: 0.912, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0243, loss_cls: 6.2493, loss_bbox: 2.8684, loss: 9.1996
2023-01-20 19:05:32,480 - mmdet - INFO - Epoch [11][2200/3696] lr: 6.000e-06, eta: 1:17:50, time: 0.931, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0244, loss_cls: 6.2559, loss_bbox: 2.8597, loss: 9.1980
2023-01-20 19:06:17,985 - mmdet - INFO - Epoch [11][2250/3696] lr: 6.000e-06, eta: 1:17:05, time: 0.910, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0566, loss_rpn_bbox: 0.0237, loss_cls: 6.2456, loss_bbox: 2.8852, loss: 9.2111
2023-01-20 19:07:04,039 - mmdet - INFO - Epoch [11][2300/3696] lr: 6.000e-06, eta: 1:16:20, time: 0.921, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0239, loss_cls: 6.2491, loss_bbox: 2.8729, loss: 9.2039
2023-01-20 19:07:49,406 - mmdet - INFO - Epoch [11][2350/3696] lr: 6.000e-06, eta: 1:15:35, time: 0.907, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0240, loss_cls: 6.2414, loss_bbox: 2.8699, loss: 9.1931
2023-01-20 19:08:35,036 - mmdet - INFO - Epoch [11][2400/3696] lr: 6.000e-06, eta: 1:14:50, time: 0.913, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0568, loss_rpn_bbox: 0.0235, loss_cls: 6.2446, loss_bbox: 2.8940, loss: 9.2190
2023-01-20 19:09:20,655 - mmdet - INFO - Epoch [11][2450/3696] lr: 6.000e-06, eta: 1:14:05, time: 0.912, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0239, loss_cls: 6.2452, loss_bbox: 2.8744, loss: 9.2010
2023-01-20 19:10:06,403 - mmdet - INFO - Epoch [11][2500/3696] lr: 6.000e-06, eta: 1:13:21, time: 0.915, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0242, loss_cls: 6.2638, loss_bbox: 2.8788, loss: 9.2244
2023-01-20 19:10:51,709 - mmdet - INFO - Epoch [11][2550/3696] lr: 6.000e-06, eta: 1:12:36, time: 0.906, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0239, loss_cls: 6.2476, loss_bbox: 2.8837, loss: 9.2130
2023-01-20 19:11:37,291 - mmdet - INFO - Epoch [11][2600/3696] lr: 6.000e-06, eta: 1:11:51, time: 0.912, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0240, loss_cls: 6.2514, loss_bbox: 2.8775, loss: 9.2105
2023-01-20 19:12:22,782 - mmdet - INFO - Epoch [11][2650/3696] lr: 6.000e-06, eta: 1:11:06, time: 0.910, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0238, loss_cls: 6.2482, loss_bbox: 2.8928, loss: 9.2226
2023-01-20 19:13:08,526 - mmdet - INFO - Epoch [11][2700/3696] lr: 6.000e-06, eta: 1:10:21, time: 0.915, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0242, loss_cls: 6.2530, loss_bbox: 2.8727, loss: 9.2078
2023-01-20 19:13:54,265 - mmdet - INFO - Epoch [11][2750/3696] lr: 6.000e-06, eta: 1:09:36, time: 0.915, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0240, loss_cls: 6.2432, loss_bbox: 2.8766, loss: 9.2009
2023-01-20 19:14:39,619 - mmdet - INFO - Epoch [11][2800/3696] lr: 6.000e-06, eta: 1:08:51, time: 0.907, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0239, loss_cls: 6.2509, loss_bbox: 2.8766, loss: 9.2089
2023-01-20 19:15:24,977 - mmdet - INFO - Epoch [11][2850/3696] lr: 6.000e-06, eta: 1:08:06, time: 0.907, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0240, loss_cls: 6.2532, loss_bbox: 2.8887, loss: 9.2234
2023-01-20 19:16:10,601 - mmdet - INFO - Epoch [11][2900/3696] lr: 6.000e-06, eta: 1:07:21, time: 0.912, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0240, loss_cls: 6.2498, loss_bbox: 2.8683, loss: 9.2001
2023-01-20 19:16:55,885 - mmdet - INFO - Epoch [11][2950/3696] lr: 6.000e-06, eta: 1:06:36, time: 0.906, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0240, loss_cls: 6.2541, loss_bbox: 2.8856, loss: 9.2212
2023-01-20 19:17:41,035 - mmdet - INFO - Epoch [11][3000/3696] lr: 6.000e-06, eta: 1:05:51, time: 0.903, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0239, loss_cls: 6.2461, loss_bbox: 2.8933, loss: 9.2203
2023-01-20 19:18:26,229 - mmdet - INFO - Epoch [11][3050/3696] lr: 6.000e-06, eta: 1:05:06, time: 0.904, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0563, loss_rpn_bbox: 0.0233, loss_cls: 6.2342, loss_bbox: 2.8939, loss: 9.2077
2023-01-20 19:19:12,356 - mmdet - INFO - Epoch [11][3100/3696] lr: 6.000e-06, eta: 1:04:21, time: 0.923, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0241, loss_cls: 6.2499, loss_bbox: 2.8657, loss: 9.1980
2023-01-20 19:19:57,658 - mmdet - INFO - Epoch [11][3150/3696] lr: 6.000e-06, eta: 1:03:36, time: 0.906, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0236, loss_cls: 6.2465, loss_bbox: 2.8857, loss: 9.2132
2023-01-20 19:20:43,229 - mmdet - INFO - Epoch [11][3200/3696] lr: 6.000e-06, eta: 1:02:52, time: 0.911, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0239, loss_cls: 6.2456, loss_bbox: 2.8678, loss: 9.1947
2023-01-20 19:21:29,390 - mmdet - INFO - Epoch [11][3250/3696] lr: 6.000e-06, eta: 1:02:07, time: 0.923, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0242, loss_cls: 6.2550, loss_bbox: 2.8759, loss: 9.2131
2023-01-20 19:22:14,363 - mmdet - INFO - Epoch [11][3300/3696] lr: 6.000e-06, eta: 1:01:22, time: 0.899, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0239, loss_cls: 6.2519, loss_bbox: 2.8973, loss: 9.2304
2023-01-20 19:22:59,352 - mmdet - INFO - Epoch [11][3350/3696] lr: 6.000e-06, eta: 1:00:37, time: 0.900, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0239, loss_cls: 6.2498, loss_bbox: 2.8824, loss: 9.2136
2023-01-20 19:23:44,983 - mmdet - INFO - Epoch [11][3400/3696] lr: 6.000e-06, eta: 0:59:52, time: 0.913, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0568, loss_rpn_bbox: 0.0237, loss_cls: 6.2488, loss_bbox: 2.8839, loss: 9.2132
2023-01-20 19:24:30,611 - mmdet - INFO - Epoch [11][3450/3696] lr: 6.000e-06, eta: 0:59:07, time: 0.913, data_time: 0.019, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0238, loss_cls: 6.2492, loss_bbox: 2.8757, loss: 9.2057
2023-01-20 19:25:16,153 - mmdet - INFO - Epoch [11][3500/3696] lr: 6.000e-06, eta: 0:58:22, time: 0.911, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0238, loss_cls: 6.2439, loss_bbox: 2.8836, loss: 9.2084
2023-01-20 19:26:01,554 - mmdet - INFO - Epoch [11][3550/3696] lr: 6.000e-06, eta: 0:57:37, time: 0.908, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0238, loss_cls: 6.2497, loss_bbox: 2.8712, loss: 9.2021
2023-01-20 19:26:46,861 - mmdet - INFO - Epoch [11][3600/3696] lr: 6.000e-06, eta: 0:56:52, time: 0.906, data_time: 0.021, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0235, loss_cls: 6.2356, loss_bbox: 2.8898, loss: 9.2062
2023-01-20 19:27:33,254 - mmdet - INFO - Epoch [11][3650/3696] lr: 6.000e-06, eta: 0:56:07, time: 0.928, data_time: 0.020, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0244, loss_cls: 6.2516, loss_bbox: 2.8655, loss: 9.1997
2023-01-20 19:28:15,808 - mmdet - INFO - Saving checkpoint at 11 epochs
2023-01-20 19:29:06,968 - mmdet - INFO - Epoch [12][50/3696] lr: 6.000e-07, eta: 0:54:37, time: 0.963, data_time: 0.073, memory: 13290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0236, loss_cls: 6.2467, loss_bbox: 2.8890, loss: 9.2164
2023-01-20 19:29:52,306 - mmdet - INFO - Epoch [12][100/3696] lr: 6.000e-07, eta: 0:53:52, time: 0.907, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0242, loss_cls: 6.2542, loss_bbox: 2.8767, loss: 9.2131
2023-01-20 19:30:37,829 - mmdet - INFO - Epoch [12][150/3696] lr: 6.000e-07, eta: 0:53:07, time: 0.910, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0569, loss_rpn_bbox: 0.0237, loss_cls: 6.2439, loss_bbox: 2.8892, loss: 9.2136
2023-01-20 19:31:23,021 - mmdet - INFO - Epoch [12][200/3696] lr: 6.000e-07, eta: 0:52:22, time: 0.904, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0237, loss_cls: 6.2471, loss_bbox: 2.8850, loss: 9.2129
2023-01-20 19:32:08,003 - mmdet - INFO - Epoch [12][250/3696] lr: 6.000e-07, eta: 0:51:38, time: 0.900, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0565, loss_rpn_bbox: 0.0235, loss_cls: 6.2279, loss_bbox: 2.8868, loss: 9.1946
2023-01-20 19:32:53,317 - mmdet - INFO - Epoch [12][300/3696] lr: 6.000e-07, eta: 0:50:53, time: 0.906, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0234, loss_cls: 6.2366, loss_bbox: 2.8906, loss: 9.2077
2023-01-20 19:33:39,387 - mmdet - INFO - Epoch [12][350/3696] lr: 6.000e-07, eta: 0:50:08, time: 0.921, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0242, loss_cls: 6.2535, loss_bbox: 2.8540, loss: 9.1900
2023-01-20 19:34:25,350 - mmdet - INFO - Epoch [12][400/3696] lr: 6.000e-07, eta: 0:49:23, time: 0.919, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0241, loss_cls: 6.2506, loss_bbox: 2.8771, loss: 9.2093
2023-01-20 19:35:11,281 - mmdet - INFO - Epoch [12][450/3696] lr: 6.000e-07, eta: 0:48:38, time: 0.919, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0238, loss_cls: 6.2432, loss_bbox: 2.8704, loss: 9.1948
2023-01-20 19:35:57,226 - mmdet - INFO - Epoch [12][500/3696] lr: 6.000e-07, eta: 0:47:53, time: 0.919, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0237, loss_cls: 6.2405, loss_bbox: 2.8893, loss: 9.2112
2023-01-20 19:36:42,556 - mmdet - INFO - Epoch [12][550/3696] lr: 6.000e-07, eta: 0:47:08, time: 0.907, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0240, loss_cls: 6.2416, loss_bbox: 2.8769, loss: 9.2005
2023-01-20 19:37:27,542 - mmdet - INFO - Epoch [12][600/3696] lr: 6.000e-07, eta: 0:46:23, time: 0.900, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0236, loss_cls: 6.2454, loss_bbox: 2.8916, loss: 9.2175
2023-01-20 19:38:13,068 - mmdet - INFO - Epoch [12][650/3696] lr: 6.000e-07, eta: 0:45:38, time: 0.911, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0569, loss_rpn_bbox: 0.0236, loss_cls: 6.2391, loss_bbox: 2.8854, loss: 9.2051
2023-01-20 19:38:58,042 - mmdet - INFO - Epoch [12][700/3696] lr: 6.000e-07, eta: 0:44:53, time: 0.899, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0239, loss_cls: 6.2524, loss_bbox: 2.8787, loss: 9.2124
2023-01-20 19:39:43,507 - mmdet - INFO - Epoch [12][750/3696] lr: 6.000e-07, eta: 0:44:08, time: 0.909, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0239, loss_cls: 6.2508, loss_bbox: 2.8789, loss: 9.2109
2023-01-20 19:40:29,658 - mmdet - INFO - Epoch [12][800/3696] lr: 6.000e-07, eta: 0:43:24, time: 0.923, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0241, loss_cls: 6.2487, loss_bbox: 2.8683, loss: 9.1985
2023-01-20 19:41:15,232 - mmdet - INFO - Epoch [12][850/3696] lr: 6.000e-07, eta: 0:42:39, time: 0.911, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0239, loss_cls: 6.2403, loss_bbox: 2.8763, loss: 9.1979
2023-01-20 19:42:00,823 - mmdet - INFO - Epoch [12][900/3696] lr: 6.000e-07, eta: 0:41:54, time: 0.912, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0240, loss_cls: 6.2468, loss_bbox: 2.8720, loss: 9.1998
2023-01-20 19:42:46,616 - mmdet - INFO - Epoch [12][950/3696] lr: 6.000e-07, eta: 0:41:09, time: 0.916, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0245, loss_cls: 6.2610, loss_bbox: 2.8580, loss: 9.2016
2023-01-20 19:43:31,662 - mmdet - INFO - Epoch [12][1000/3696] lr: 6.000e-07, eta: 0:40:24, time: 0.901, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0239, loss_cls: 6.2502, loss_bbox: 2.8780, loss: 9.2099
2023-01-20 19:44:17,154 - mmdet - INFO - Epoch [12][1050/3696] lr: 6.000e-07, eta: 0:39:39, time: 0.910, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0236, loss_cls: 6.2450, loss_bbox: 2.8865, loss: 9.2123
2023-01-20 19:45:03,452 - mmdet - INFO - Epoch [12][1100/3696] lr: 6.000e-07, eta: 0:38:54, time: 0.926, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0240, loss_cls: 6.2486, loss_bbox: 2.8632, loss: 9.1933
2023-01-20 19:45:48,311 - mmdet - INFO - Epoch [12][1150/3696] lr: 6.000e-07, eta: 0:38:09, time: 0.897, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0235, loss_cls: 6.2344, loss_bbox: 2.8928, loss: 9.2079
2023-01-20 19:46:33,030 - mmdet - INFO - Epoch [12][1200/3696] lr: 6.000e-07, eta: 0:37:24, time: 0.894, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0238, loss_cls: 6.2454, loss_bbox: 2.8844, loss: 9.2108
2023-01-20 19:47:17,713 - mmdet - INFO - Epoch [12][1250/3696] lr: 6.000e-07, eta: 0:36:39, time: 0.894, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0236, loss_cls: 6.2406, loss_bbox: 2.8806, loss: 9.2018
2023-01-20 19:48:03,028 - mmdet - INFO - Epoch [12][1300/3696] lr: 6.000e-07, eta: 0:35:54, time: 0.906, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0238, loss_cls: 6.2489, loss_bbox: 2.8791, loss: 9.2093
2023-01-20 19:48:48,821 - mmdet - INFO - Epoch [12][1350/3696] lr: 6.000e-07, eta: 0:35:09, time: 0.916, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0239, loss_cls: 6.2435, loss_bbox: 2.8821, loss: 9.2073
2023-01-20 19:49:34,130 - mmdet - INFO - Epoch [12][1400/3696] lr: 6.000e-07, eta: 0:34:24, time: 0.906, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0238, loss_cls: 6.2448, loss_bbox: 2.8838, loss: 9.2099
2023-01-20 19:50:19,655 - mmdet - INFO - Epoch [12][1450/3696] lr: 6.000e-07, eta: 0:33:39, time: 0.910, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0239, loss_cls: 6.2468, loss_bbox: 2.8830, loss: 9.2111
2023-01-20 19:51:05,327 - mmdet - INFO - Epoch [12][1500/3696] lr: 6.000e-07, eta: 0:32:54, time: 0.914, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0239, loss_cls: 6.2496, loss_bbox: 2.8615, loss: 9.1929
2023-01-20 19:51:50,409 - mmdet - INFO - Epoch [12][1550/3696] lr: 6.000e-07, eta: 0:32:10, time: 0.902, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0567, loss_rpn_bbox: 0.0234, loss_cls: 6.2359, loss_bbox: 2.8954, loss: 9.2115
2023-01-20 19:52:35,780 - mmdet - INFO - Epoch [12][1600/3696] lr: 6.000e-07, eta: 0:31:25, time: 0.907, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0239, loss_cls: 6.2425, loss_bbox: 2.8790, loss: 9.2025
2023-01-20 19:53:21,165 - mmdet - INFO - Epoch [12][1650/3696] lr: 6.000e-07, eta: 0:30:40, time: 0.908, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0238, loss_cls: 6.2468, loss_bbox: 2.8702, loss: 9.1980
2023-01-20 19:54:07,087 - mmdet - INFO - Epoch [12][1700/3696] lr: 6.000e-07, eta: 0:29:55, time: 0.919, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0247, loss_cls: 6.2617, loss_bbox: 2.8697, loss: 9.2149
2023-01-20 19:54:53,243 - mmdet - INFO - Epoch [12][1750/3696] lr: 6.000e-07, eta: 0:29:10, time: 0.923, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0242, loss_cls: 6.2599, loss_bbox: 2.8620, loss: 9.2045
2023-01-20 19:55:38,650 - mmdet - INFO - Epoch [12][1800/3696] lr: 6.000e-07, eta: 0:28:25, time: 0.908, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0239, loss_cls: 6.2412, loss_bbox: 2.8660, loss: 9.1886
2023-01-20 19:56:23,454 - mmdet - INFO - Epoch [12][1850/3696] lr: 6.000e-07, eta: 0:27:40, time: 0.896, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0239, loss_cls: 6.2401, loss_bbox: 2.8878, loss: 9.2096
2023-01-20 19:57:08,472 - mmdet - INFO - Epoch [12][1900/3696] lr: 6.000e-07, eta: 0:26:55, time: 0.900, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0239, loss_cls: 6.2508, loss_bbox: 2.8691, loss: 9.2011
2023-01-20 19:57:53,932 - mmdet - INFO - Epoch [12][1950/3696] lr: 6.000e-07, eta: 0:26:10, time: 0.909, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0563, loss_rpn_bbox: 0.0233, loss_cls: 6.2436, loss_bbox: 2.8886, loss: 9.2119
2023-01-20 19:58:39,539 - mmdet - INFO - Epoch [12][2000/3696] lr: 6.000e-07, eta: 0:25:25, time: 0.912, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0245, loss_cls: 6.2594, loss_bbox: 2.8726, loss: 9.2152
2023-01-20 19:59:24,348 - mmdet - INFO - Epoch [12][2050/3696] lr: 6.000e-07, eta: 0:24:40, time: 0.896, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0240, loss_cls: 6.2527, loss_bbox: 2.8907, loss: 9.2257
2023-01-20 20:00:09,391 - mmdet - INFO - Epoch [12][2100/3696] lr: 6.000e-07, eta: 0:23:55, time: 0.901, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0240, loss_cls: 6.2477, loss_bbox: 2.8911, loss: 9.2202
2023-01-20 20:00:55,373 - mmdet - INFO - Epoch [12][2150/3696] lr: 6.000e-07, eta: 0:23:10, time: 0.920, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0240, loss_cls: 6.2484, loss_bbox: 2.8621, loss: 9.1921
2023-01-20 20:01:40,641 - mmdet - INFO - Epoch [12][2200/3696] lr: 6.000e-07, eta: 0:22:25, time: 0.905, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0237, loss_cls: 6.2464, loss_bbox: 2.8767, loss: 9.2039
2023-01-20 20:02:26,058 - mmdet - INFO - Epoch [12][2250/3696] lr: 6.000e-07, eta: 0:21:40, time: 0.908, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0239, loss_cls: 6.2515, loss_bbox: 2.8766, loss: 9.2093
2023-01-20 20:03:11,969 - mmdet - INFO - Epoch [12][2300/3696] lr: 6.000e-07, eta: 0:20:55, time: 0.918, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0245, loss_cls: 6.2621, loss_bbox: 2.8754, loss: 9.2200
2023-01-20 20:03:57,420 - mmdet - INFO - Epoch [12][2350/3696] lr: 6.000e-07, eta: 0:20:10, time: 0.909, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0242, loss_cls: 6.2564, loss_bbox: 2.8681, loss: 9.2070
2023-01-20 20:04:42,542 - mmdet - INFO - Epoch [12][2400/3696] lr: 6.000e-07, eta: 0:19:25, time: 0.902, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0239, loss_cls: 6.2479, loss_bbox: 2.8801, loss: 9.2093
2023-01-20 20:05:28,091 - mmdet - INFO - Epoch [12][2450/3696] lr: 6.000e-07, eta: 0:18:40, time: 0.911, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0587, loss_rpn_bbox: 0.0244, loss_cls: 6.2587, loss_bbox: 2.8648, loss: 9.2066
2023-01-20 20:06:13,391 - mmdet - INFO - Epoch [12][2500/3696] lr: 6.000e-07, eta: 0:17:55, time: 0.906, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0577, loss_rpn_bbox: 0.0239, loss_cls: 6.2495, loss_bbox: 2.8833, loss: 9.2143
2023-01-20 20:06:58,457 - mmdet - INFO - Epoch [12][2550/3696] lr: 6.000e-07, eta: 0:17:10, time: 0.901, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0236, loss_cls: 6.2393, loss_bbox: 2.8858, loss: 9.2061
2023-01-20 20:07:43,325 - mmdet - INFO - Epoch [12][2600/3696] lr: 6.000e-07, eta: 0:16:25, time: 0.897, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0237, loss_cls: 6.2437, loss_bbox: 2.8843, loss: 9.2089
2023-01-20 20:08:29,668 - mmdet - INFO - Epoch [12][2650/3696] lr: 6.000e-07, eta: 0:15:40, time: 0.927, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0590, loss_rpn_bbox: 0.0245, loss_cls: 6.2665, loss_bbox: 2.8592, loss: 9.2092
2023-01-20 20:09:15,295 - mmdet - INFO - Epoch [12][2700/3696] lr: 6.000e-07, eta: 0:14:55, time: 0.913, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0241, loss_cls: 6.2588, loss_bbox: 2.8777, loss: 9.2187
2023-01-20 20:10:01,157 - mmdet - INFO - Epoch [12][2750/3696] lr: 6.000e-07, eta: 0:14:11, time: 0.917, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0246, loss_cls: 6.2592, loss_bbox: 2.8568, loss: 9.1995
2023-01-20 20:10:46,916 - mmdet - INFO - Epoch [12][2800/3696] lr: 6.000e-07, eta: 0:13:26, time: 0.915, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0240, loss_cls: 6.2469, loss_bbox: 2.8791, loss: 9.2073
2023-01-20 20:11:33,318 - mmdet - INFO - Epoch [12][2850/3696] lr: 6.000e-07, eta: 0:12:41, time: 0.928, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0243, loss_cls: 6.2503, loss_bbox: 2.8586, loss: 9.1909
2023-01-20 20:12:18,899 - mmdet - INFO - Epoch [12][2900/3696] lr: 6.000e-07, eta: 0:11:56, time: 0.912, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0567, loss_rpn_bbox: 0.0235, loss_cls: 6.2295, loss_bbox: 2.8753, loss: 9.1850
2023-01-20 20:13:04,092 - mmdet - INFO - Epoch [12][2950/3696] lr: 6.000e-07, eta: 0:11:11, time: 0.904, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0237, loss_cls: 6.2424, loss_bbox: 2.8784, loss: 9.2018
2023-01-20 20:13:49,276 - mmdet - INFO - Epoch [12][3000/3696] lr: 6.000e-07, eta: 0:10:26, time: 0.904, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0238, loss_cls: 6.2424, loss_bbox: 2.8921, loss: 9.2156
2023-01-20 20:14:34,967 - mmdet - INFO - Epoch [12][3050/3696] lr: 6.000e-07, eta: 0:09:41, time: 0.914, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0240, loss_cls: 6.2532, loss_bbox: 2.8653, loss: 9.1997
2023-01-20 20:15:19,767 - mmdet - INFO - Epoch [12][3100/3696] lr: 6.000e-07, eta: 0:08:56, time: 0.896, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0238, loss_cls: 6.2401, loss_bbox: 2.8816, loss: 9.2028
2023-01-20 20:16:05,274 - mmdet - INFO - Epoch [12][3150/3696] lr: 6.000e-07, eta: 0:08:11, time: 0.910, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0567, loss_rpn_bbox: 0.0236, loss_cls: 6.2433, loss_bbox: 2.8763, loss: 9.1999
2023-01-20 20:16:51,538 - mmdet - INFO - Epoch [12][3200/3696] lr: 6.000e-07, eta: 0:07:26, time: 0.925, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0244, loss_cls: 6.2556, loss_bbox: 2.8630, loss: 9.2008
2023-01-20 20:17:37,254 - mmdet - INFO - Epoch [12][3250/3696] lr: 6.000e-07, eta: 0:06:41, time: 0.914, data_time: 0.018, memory: 13290, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0237, loss_cls: 6.2392, loss_bbox: 2.8842, loss: 9.2044
2023-01-20 20:18:22,948 - mmdet - INFO - Epoch [12][3300/3696] lr: 6.000e-07, eta: 0:05:56, time: 0.914, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0238, loss_cls: 6.2393, loss_bbox: 2.8683, loss: 9.1887
2023-01-20 20:19:08,446 - mmdet - INFO - Epoch [12][3350/3696] lr: 6.000e-07, eta: 0:05:11, time: 0.910, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0240, loss_cls: 6.2573, loss_bbox: 2.8834, loss: 9.2222
2023-01-20 20:19:53,437 - mmdet - INFO - Epoch [12][3400/3696] lr: 6.000e-07, eta: 0:04:26, time: 0.900, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0238, loss_cls: 6.2482, loss_bbox: 2.8806, loss: 9.2102
2023-01-20 20:20:38,917 - mmdet - INFO - Epoch [12][3450/3696] lr: 6.000e-07, eta: 0:03:41, time: 0.910, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0240, loss_cls: 6.2573, loss_bbox: 2.8711, loss: 9.2098
2023-01-20 20:21:24,598 - mmdet - INFO - Epoch [12][3500/3696] lr: 6.000e-07, eta: 0:02:56, time: 0.914, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0236, loss_cls: 6.2403, loss_bbox: 2.8824, loss: 9.2036
2023-01-20 20:22:10,280 - mmdet - INFO - Epoch [12][3550/3696] lr: 6.000e-07, eta: 0:02:11, time: 0.914, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0242, loss_cls: 6.2497, loss_bbox: 2.8664, loss: 9.1983
2023-01-20 20:22:56,135 - mmdet - INFO - Epoch [12][3600/3696] lr: 6.000e-07, eta: 0:01:26, time: 0.917, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0586, loss_rpn_bbox: 0.0246, loss_cls: 6.2562, loss_bbox: 2.8512, loss: 9.1905
2023-01-20 20:23:42,825 - mmdet - INFO - Epoch [12][3650/3696] lr: 6.000e-07, eta: 0:00:41, time: 0.934, data_time: 0.017, memory: 13290, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0246, loss_cls: 6.2624, loss_bbox: 2.8589, loss: 9.2044
2023-01-20 20:24:25,511 - mmdet - INFO - Saving checkpoint at 12 epochs
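The per-iteration entries above follow a fixed text format written by mmdetection's text logger. A small hypothetical parser (not part of mmdetection; the regex and function names below are illustrative) can extract the loss curve from such a log for plotting or summary statistics:

```python
import re

# Matches one log entry of the format seen above; field names (Epoch, iter,
# loss_cls, loss_bbox, loss) are taken verbatim from the log text.
ENTRY = re.compile(
    r"Epoch \[(?P<epoch>\d+)\]\[(?P<iter>\d+)/(?P<total>\d+)\].*?"
    r"loss_cls: (?P<loss_cls>[\d.]+), loss_bbox: (?P<loss_bbox>[\d.]+), "
    r"loss: (?P<loss>[\d.]+)"
)

def parse_losses(text):
    """Return a list of (epoch, iteration, total_loss) tuples from raw log text."""
    return [
        (int(m["epoch"]), int(m["iter"]), float(m["loss"]))
        for m in ENTRY.finditer(text)
    ]

# One entry copied from the log above, used as a self-contained sample.
sample = (
    "2023-01-20 20:23:42,825 - mmdet - INFO - Epoch [12][3650/3696] "
    "lr: 6.000e-07, eta: 0:00:41, time: 0.934, data_time: 0.017, "
    "memory: 13290, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0246, "
    "loss_cls: 6.2624, loss_bbox: 2.8589, loss: 9.2044"
)
print(parse_losses(sample))  # → [(12, 3650, 9.2044)]
```

Applied to the full log, such a parser makes the trend easy to see: the total loss stays essentially flat around 9.20 across epochs 10 to 12, even after the learning-rate drop from 6.000e-06 to 6.000e-07 at epoch 12.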